unhosted web apps
This looks like a possible solution to the "careless computing" problem:
http://unhosted.org/
I found this explanation easier to follow:
https://wiki.p2pfoundation.net/Unhosted
"Unhosted apps are web applications able to run locally in your browser – because they are pure JavaScript, like many web apps already. You identify with your user address which then connects your remote storage to the app, loads your data and decrypts it locally – all in your browser, nothing leaking to the app server.
"This makes it easier and highly secure for users: You have your data in one place, like a »My Documents« folder that you can use with web apps. And you don’t need to get a separate account for every application you use, nor export and import it over – you only remember your storage user address. Your data is not being snooped or encumbered in proprietary platforms. It also makes it easier for app developers because they neither have to worry about hosting all the data and user accounts nor about server load – all the computing takes place in your browser. With the app being just JavaScript it becomes very easy to develop and deploy new apps which everyone can use. Technically speaking, we define a standard combining things like WebFinger, OAuth, WebDAV, Cross-Origin Resource Sharing (CORS), preferably BrowserID and ideally with end-to-end-encryption on top." (NextNet mailing list, October 2011)"
"Manifesto for the unhosted web
Freedom from Web2.0's monopoly platforms.
Free/libre and Open Source Software (FLOSS) frees us from having to install proprietary software on our terminals. But installable software is losing ground to hosted software (websites). The server software is often open source (e.g. LAMP), but the website itself as a software product is almost always proprietary. There is an obvious reason for this: Even if an Affero license allows us to download the website's source code, only a commercial company can finance the thousands of servers needed to host a successful website. To make things worse, hosted software has more power over its users than installable software, because it forces you to put your user data on servers owned by the same company that publishes the software. If you want to use Google Docs, you have to reveal your work to a Google-owned server (what Richard Stallman calls "careless computing").
Unhosted
I left my dayjob and started the UNHOSTED project to try and stop this. We needed to break the one-to-one link between the software publisher who writes a website (e.g. "Google, Inc") and the "hostage provider" who hosts that website (e.g. also "Google, Inc"). Unhosted creates a simple grease layer in the form of an open web standard (UJ/0.1) between the hosted software and the servers that host it, so this is decoupled."
I know people here have some firm opinions about the pros and cons of JavaScript and what is required to solve the JavaScript Trap. Curious to know what you all think about this approach.
Only so long as these programs written in JavaScript are free. Even then, metadata still gets leaked: how many times you visit the site, when, where you're connecting from, and so on. And if they're truly running only on the person's computer, then it seems clunky to be getting them from a web browser in the first place. They may as well be converted to C or some other "proper" language (LOL) and installed via the package manager. Then truly no one knows when or how often the program is used.
Or even just run in a local JavaScript interpreter/Web browser that doesn't connect to the Internet.
Yes, in cases of "Service as a Software Substitute", a program running on the local computer could provide all the same functions using a "proper" language (as jxself says). To clarify, my understanding is that the 'unhosted web apps' proposal covers those situations where the internet is required and where a server is normally used, for example:
* webmail (eg RoundCube)
* automated off-site backups and multi-device file sync (eg NextCloud)
* online event calendars for public display and data-sharing via WebDAV/iCal (eg Cozy)
* groupware and chat platforms (eg Loomio, MatterMost, CoActivate)
Now in an ideal world, everyone would grow all their own fruit and vegetables in their backyard and run their own servers for their home/ office, running free code software, to provide for these needs. Needless to say, this is not currently the world we live in, and significant upskilling would be needed to make it possible. Most of the older generation will simply never have these skills, and must depend either on online spyware like Google et al, or on trusted geeks running servers for them.
The proposal for unhosted web apps (yes, free code is part of the proposal) is to reduce the degree to which the geek running the server must be trusted, as well as to reduce centralization and single-points-of-failure in these services (eg allowing separation of software/ processing and storage).
The advantage of using the browser is that the same software will work at the user end regardless of the underlying platform, presuming the browser and the software properly implement the standards for HTML, CSS etc. This massively reduces the amount of work involved in making the apps user-friendly and cross-platform.
JavaScript applications presented as remote Web pages are convenient, sure. But their convenience is also their greatest danger. You know how Windows users can get infected with malware just by visiting the wrong Web pages? That's JavaScript; on a sane computing system, you choose explicitly which programs to install and run, but in the Web browser, everything is installed and run automatically, and that tends to include programs that are malicious.
Then there's the problem of updates to the application being automatic: you may have reviewed the source code of one version 5 minutes ago, but that doesn't mean that's what you're getting now. On a sane computing system, you keep the program you installed for as long as you use it, and you can control how often it is updated. In a Web browser, it's impossible to even tell whether a script has changed, so updates are not only automatic, but silent. If someone wanted to attack you, it would be trivially easy for the Web server (or a man in the middle, if it's over HTTP) to send you a malicious version of the program (say, a version that contains spyware) for a brief period of time, then revert to the non-malicious version a few minutes later.
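To put that in concrete terms, merely noticing a silent change would require something like the following: fetch the script yourself and compare its hash against the copy you actually reviewed. No browser does this for you, and the URL and recorded hash here are placeholders:

    // Sketch only: the URL and AUDITED_HASH below are placeholders.
    async function scriptFingerprint(url) {
      const source = await (await fetch(url)).text();
      const digest = await crypto.subtle.digest(
        'SHA-256',
        new TextEncoder().encode(source)
      );
      // Render the digest as a hex string for comparison.
      return Array.from(new Uint8Array(digest))
        .map(b => b.toString(16).padStart(2, '0'))
        .join('');
    }

    const AUDITED_HASH = '...'; // hash of the copy you reviewed 5 minutes ago
    scriptFingerprint('https://example.com/app.js').then(hash => {
      if (hash !== AUDITED_HASH) {
        console.warn('The script being served now is not the one you reviewed.');
      }
    });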
And the real kicker is, you can't edit JavaScript programs distributed as a part of a remote Web page. You can edit the program, sure, but you can't actually use your modified version in place of whatever version the Web page distributes/requests.
I've written about the problem of JavaScript embedded into Web pages in the past:
https://onpon4.github.io/other/kill-js/
In short, executing JavaScript distributed with remote Web pages, at least the way all Web browsers do it today, should never be done. It is incompatible both with true liberty and with even modest security. The only way it can be acceptable is if a Web browser is designed to give the user full control over what scripts are executed, what those scripts do, and when they are updated.
Of course, this all applies only to remote Web pages. If you have a local Web page (stored entirely on your computer), then any JavaScript program on it is perfectly fine (as long as it is libre and non-malicious). But that kind of defeats the whole purpose of basing a program on manipulation of a Web page.
Perhaps one can use the adapted version by using GreaseMonkey.
I'm using GNU LibreJS to block all non-free JS from YouTube, and I'm also using GreaseMonkey with a YouTube video player whose name I forget right now, and it works as expected.
One other alternative, which needs to be implemented by the site developer, is to make a site load no JavaScript at all and instead ask users to install a GreaseMonkey script. Perhaps this would let users keep different versions of the script, or even revert to an older one: GreaseMonkey has a sort of package manager of its own, although the process is manual, since you have to visit the @updateURL, fetch the desired older version, install it over the current one, and disable automatic updates for that script only.
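For what it's worth, here is what such a user script could look like. Everything in it (the site, the URLs, the behaviour) is made up, but the metadata keys are the standard GreaseMonkey ones, including the @updateURL mentioned above:

    // ==UserScript==
    // @name        Example site helper
    // @namespace   https://example.com/
    // @description Hypothetical script a no-JavaScript site could offer for download
    // @version     1.2.0
    // @match       https://example.com/*
    // @updateURL   https://example.com/scripts/helper.meta.js
    // @downloadURL https://example.com/scripts/helper.user.js
    // @grant       none
    // ==/UserScript==

    (function () {
      'use strict';
      // The page itself ships no JavaScript; all behaviour lives in this
      // user script, which the user chose to install and can pin or roll back.
      document.querySelectorAll('time').forEach(function (el) {
        el.title = new Date(el.dateTime).toLocaleString();
      });
    })();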
> Perhaps one can use the adapted version by using GreaseMonkey.
Sort of, but the Greasemonkey API is completely different from the normal API, and Greasemonkey can't be relied on to stop the page's existing JavaScript from executing, so all of a site's JavaScript would have to be converted, which is overly burdensome and not something anyone can realistically do.
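As a rough illustration of why it's a conversion and not a copy-paste (the endpoint and the page's render() function below are made up): a page script can just use fetch and the page's own globals, while a user script runs in a sandbox and reaches out through privileged GM functions:

    // As ordinary page JavaScript: same-origin fetch, direct access to globals.
    fetch('/api/feed')
      .then(function (r) { return r.json(); })
      .then(function (items) { render(items); });

    // As a Greasemonkey user script (classic API; needs "@grant GM_xmlhttpRequest"
    // in the metadata block, and newer Greasemonkey spells it GM.xmlHttpRequest):
    GM_xmlhttpRequest({
      method: 'GET',
      url: 'https://example.com/api/feed',
      onload: function (response) {
        const items = JSON.parse(response.responseText);
        // Page globals are only reachable through unsafeWindow from the sandbox.
        unsafeWindow.render(items);
      }
    });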
At best, user scripts can make some JavaScript-dependent websites work to some degree (like ViewTube on YouTube). Only on Firefox, though; I haven't seen any other browser where user scripts still work if you disable normal JavaScript execution.
> One other alternative, which needs to be implemented by the site developer, is to make a site load no JavaScript at all and instead ask users to install a GreaseMonkey script.
Yes, that would be perfectly acceptable. But it's never going to happen that way. It would be much easier to just make the website not require JavaScript in the first place.
For blocking JavaScript, this is why we have GNU LibreJS and also NoScript.
NoScript doesn't support the kind of fine-grained control that would be needed to selectively use user-script variants of certain scripts; it only supports whitelisting domains. LibreJS is even worse; it doesn't even have a setting to stop it from automatically executing any script it detects as libre.
This isn't the core problem, though. The core problem is that making a user script that works the same as a regular script is a conversion effort, so trivial changes to the script are not actually trivial to make.