Defining the Open Web and Silos

created Oct 7, 2016 - updated Aug 23, 2017

(This long page needs to be pared down.)

What do the terms “open web” and “silos” mean? Most internet users don’t care, and they probably have no reason to be concerned. But the terms can be confusing even to the few people who do care about the differences.

Here’s a 2010 post at tantek.com titled “What is the Open Web?”

Tantek Çelik started, or helped to start, the IndieWeb movement around 2010. In that post, he wrote about the open web:

In summary:

  • open content and application publishing
  • open ability to code and implement the standards that such content depends on
  • open access to content, web-applications, web standards implementations (browsers), and the internet.

About the IndieWeb: http://indieweb.org/indieweb

The IndieWeb is about owning your domain, using it as your primary identity to publish on your own site (optionally syndicate elsewhere), and owning your data.

The above philosophy represents my definition of the open web too. The IndieWeb advocates additional functionality built upon the open web: Webmention, a cross-site commenting and notification system; Micropub, an open protocol for publishing from clients to servers; IndieAuth, a way to log into websites using your own domain; and other concepts.
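
Webmention, for example, boils down to two HTTP requests: discover the target page’s Webmention endpoint, then POST the source and target URLs to it. Below is a minimal sketch in Python using the requests library; the URLs are placeholders, and a real sender should follow the full W3C spec (redirect handling, scanning <a> tags properly, and so on).

```python
# Minimal Webmention sender: a sketch, not a spec-complete client.
# Assumes the third-party `requests` library; the URLs are placeholders.
import re
import requests

SOURCE = "https://example.com/my-reply"    # my post, which links to TARGET
TARGET = "https://example.org/their-post"  # the post I am responding to

def discover_endpoint(target):
    """Find the target's Webmention endpoint via the HTTP Link header,
    falling back to a crude scan of the HTML for rel="webmention"."""
    resp = requests.get(target, timeout=10)
    if "webmention" in resp.links:  # requests parses Link headers for us
        return requests.compat.urljoin(target, resp.links["webmention"]["url"])
    match = re.search(
        r'<(?:link|a)\b[^>]*rel="?webmention"?[^>]*href="([^"]+)"', resp.text)
    return requests.compat.urljoin(target, match.group(1)) if match else None

endpoint = discover_endpoint(TARGET)
if endpoint:
    # The notification itself is one form-encoded POST.
    resp = requests.post(endpoint, data={"source": SOURCE, "target": TARGET})
    print(resp.status_code)  # 200/201/202 signal acceptance
```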


June 2016 - The Atlantic story

The open web is the nickname for the internet as it should be—free, uncensorable, and independently owned and operated. According to the blog posts that hashed out most of its theory (and which themselves were published on the open web), the open web describes an internet where people mostly publish their writing (or music, or photos, or films) to servers that they own or rent, accessible via their own personal domain names, in formats that are themselves free or unrestricted.

It is the web because the pages are written in HTML and CSS; it is open because anyone can access almost all of it, without special privileges, expenditures, or a user account. Above all, the open web is free—free like language is free, like consciousness is free. Freedom not so much as a right, but as a technical and inalienable fact.


May 2017 - The Verge - And now, a brief definition of the web

To count as being part of the web, your app or page must:

  1. Be linkable, and

  2. Allow any client to access it.

That’s it. Okay, not really. There are a lot of details to get into here, specifically with the second point.

But let’s tackle the first, because it’s easy. Whatever it is you’re publishing should be linkable: it should have a URL that other things can point to.

For the web, that rule is pretty clear: whether you use Chrome or Safari or Edge or Opera or whatever, when you click a link or type in a URL, you get the page you wanted (more or less). Those pages are agnostic to the client.

That agnosticism isn’t easy; it’s driven by web standards and the W3C organization that crafts them.

It can get messy, though, especially with technologies like Instant Articles or AMP. Particularly AMP, which is built on a subset of the same HTML stack that powers the web.

… the real issue for AMP being part of the web is that Google defines the terms of what does and doesn’t work on AMP and could limit it whenever it likes.


May 2017 - The Register - Kill Google AMP before it KILLS the web

Google’s AMP is bad — bad in a potentially web-destroying way. Google AMP is bad news for how the web is built, it’s bad news for publishers of credible online content, and it’s bad news for consumers of that content. Google AMP is only good for one party: Google.

So far AMP actually sounds appealing. Except that, hilariously, to create an AMP page you have to load a, wait for it, yes a JavaScript file from Google. Pinboard founder Maciej Cegłowski already recreated the Google AMP demo page without the Google AMP JavaScript and, unsurprisingly, it’s faster than Google’s version.

So it’s not really about speed. As with anything that eschews standards for its own modified version thereof, it’s about lock-in. Tons of pages in Google AMP markup mean tons of pages that are optimized specifically for Google and indexed primarily by Google and shown primarily to Google users. It’s Google’s attempt to match Facebook’s platform. And yes, Facebook is far worse than AMP, but that doesn’t make Google AMP a good idea. At least Facebook doesn’t try to pretend like it’s open.


May 2017 - Daring Fireball post about the above story

I’m on the record as being strongly opposed to AMP simply on the grounds of publication independence. I’d stand by that even if the implementation were great. But the implementation is not great — it’s terrible. Yes, AMP pages load fast, but you don’t need AMP for fast-loading web pages. If you are a publisher and your web pages don’t load fast, the sane solution is to fix your fucking website so that pages load fast, not to throw your hands up in the air and implement AMP.


January 2017 - danielmiessler.com - Google AMP is Not a Good Thing

The entire point of the [web] was to link to things. You create something. We link to it. I create something. We link to it. That’s why they called it a web.

But now we’re seeing these new models that are looking to break the system. Create central platforms and get everyone using it. Build a search engine and display people’s content without passing the user through to it. It’s poisonous to the underlying concept of an open [web].

So, short version, AMP is bad. It’s an attack on the core principle of net, inter, web, and all the other metaphorical terms that we think of when we imagine the internet. They all mean connectedness. To each other.


May 2017 - scripting.com post

I don’t know what your privacy settings are. So if I point to your post, it’s possible a lot of people might not be able to read it.

Facebook seems solid now, but they could go away or retire the service you posted on. Deprecate the links. Who knows. You might not even mind, but I do. I like my archives to last as long as possible.

Get a blog. If your ideas have any value put them on the open web. Facebook is trying to kill it. Trust me you will hate yourself if they succeed. Same with Google.


June 2017 - Daring Fireball post

You might think it’s hyperbole for Winer to say that Facebook is trying to kill the open web. But they are. I complain about Google AMP, but AMP is just a dangerous step toward a Google-owned walled garden — Facebook is designed from the ground up as an all-out attack on the open web.

The original post by Marc Haynes was public, which I know because I do not have a Facebook account …

Marc Haynes’s Facebook post about Roger Moore is viewable by anyone, but:

It is not accessible to search engines. Search for “Marc Haynes Roger Moore” on any major search engine — DuckDuckGo, Google, Bing — and you will get hundreds of results. The story went viral, deservedly. But not only is the top result not Haynes’s original post on Facebook, his post doesn’t show up anywhere in the results because Facebook forbids search engines from indexing Facebook posts.

Content that isn’t indexable by search engines is not part of the open web. Facebook forbids The Internet Archive from saving a copy of posts. The only way to find Facebook posts is through Facebook.

The Internet Archive is our only good defense against broken links. Blocking them from indexing Facebook content is a huge “fuck you” to anyone who cares about the longevity of the stuff they link to.

Treat Facebook as the private walled garden that it is. If you want something to be publicly accessible, post it to a real blog on any platform that embraces the real web, the open one.


June 2017 - joecieplinski.com post

Look, I get that I’m the nut who doesn’t want to use Facebook. I’m not even saying don’t post your stuff to Facebook. But if Facebook is the only place you are posting something, know that you are shutting out people like me for no good reason. Go ahead and post to Facebook, but post it somewhere else, too. Especially if you’re running a business.

The number of restaurants, bars, and other local establishments that, thanks to crappy web sites they can’t update, post their daily specials, hours, and important announcements only via Facebook is growing. That’s maddening. Want to know if we’re open this holiday weekend? Go to Facebook. Go to hell.

It’s 2017. There are a million ways to get a web site set up inexpensively that you can easily update yourself. Setting up a Facebook page and letting your web site rot, or worse, not even having a web site of your own, is outsourcing your entire online presence. That’s truly insane. It’s a massive risk to your business, and frankly, stupid.

I don’t blame small business owners

More than a few people are frustrated about how the open web is used and abused, but businesses flock to the silos because allegedly user-friendly services like Facebook provide the tools and the network effect that save small business owners time and effort.

Unfortunately, the open web is harder to use than a Facebook account or a group page. On the open web, it’s harder to maintain a server and web publishing software, and it’s harder to get noticed without relying somewhat on the social media giants.

Small business owners have little room for error. Should they learn web history and maintain their own server and website to support a cause, or should they do whatever is easiest to generate the most revenue?

It’s easy for someone like me to sit on the sidelines and ankle-bite a small business owner for not owning a domain name and using it to host a website. Following the open web activists won’t generate as much revenue as following the social media crowd.

But I’m still allowed to complain. It’s up to the open web activists and technologists, such as the IndieWeb community, to build tools to make it easy for non-tech people to use the open web in conjunction with using social media. The IndieWeb community definitely leads the way on this issue, but it will take time, a long time, to make the open web or the IndieWeb work easily for everyone.

And the IndieWeb is not a replacement for social media. Nobody is trying to “kill off” anything. Most people involved in the IndieWeb community make heavy use of social media, but they do so from their own websites. Their social media content gets posted first to their own personal websites, and interactions by others on social media come back to those websites.

Server-hosted solution

When a website is hosted at Digital Ocean, Dreamhost, Amazon Web Services, or any other place where the site owner pays a fee for server space, I call this a server-hosted solution.

With a server-hosted setup, the site owner has command-line access to the server and can install nearly any server software desired.

The site owner must install the web publishing software and is responsible for securing, configuring, and updating the operating system, the CMS, and any other software, such as a database server and a caching server. DNS needs to be managed too.

With this setup, the site owner can replace any piece of software, ideally while keeping the permalinks intact. The site owner can customize the server code and how the website displays in the browser with nearly no restrictions.

The server-hosted solution provides incredible flexibility, but it requires more technical expertise. And most web publishers do not want to be sys admins, programmers, designers, and security gurus.

https://indieweb.org/admin_tax

admin tax is all the time you spend maintaining your personal site, rather than actually using it (like to create posts).

Examples of admin tax:

  • DBA tax
  • renewing your domain names
  • paying bills (for web hosting, domain registrars)
  • updating software
  • removing spam
  • blocking spammers

CMS-hosted solution

Tumblr, Blogger, Medium, Svbtle, WordPress.com, and Ghost.com are examples of a CMS-hosted solution. The site owner does not have command-line access to the server hosting the site.

Some CMS-hosted solutions permit the site owner to install custom plug-ins that provide additional server-side functions and different ways to display content in the browser.

But limitations exist. The server software cannot be modified. The site owner cannot swap out one CMS for another without changing CMS providers. Authors are bound by the options the CMS provider makes available, and by whether the provider remains in business.

But for many web publishers, the restrictions that come from using a CMS-hosted solution are less important than the benefits of not having to worry about securing and updating the server software.

With a CMS-hosted solution, authors can focus on producing content.


https://chat.indieweb.org/2017-07-11

I learned something new in the above chat log, and it’s a possible concern with using a CMS-hosted solution, such as Tumblr.

Apparently, Tumblr may deactivate or recycle a username (name.tumblr.com) if the owner has not posted in a while.

Tumblr sent an email like this to a user:

It’s been a while since you’ve been on Tumblr, and we wanted to make sure that you’re still interested in using the username [username]. If so, just hit this button:

If not, you don’t have to do anything. If we don’t hear from you within two weeks, we’ll just give you a temporary username and release your old one back into the wild.

You can come back and change your temporary name to whatever you want, whenever you’re ready. Your content will all still be here when you get back.

That’s not a sturdy solution for longevity. The permalinks would be broken.

Tumblr provides domain name mapping. Since the open web/IndieWeb encourages users to buy a domain name, I assume that the above scenario is not a problem if the domain name points to a Tumblr blog. This might not be a good place, however, to make an assumption. Again, that’s a drawback of using a CMS-hosted solution.


July 2017 - boffosocko.com - The Facebook Algorithm Mom Problem

A silo like Facebook controls what users see. Users have less control over their feeds than Facebook’s algorithm does.

For quite a while now, I’ve been publishing most of my content to my personal website first and syndicating copies of it to social media silos like Twitter, Instagram, Google+, and Facebook. Within the Indieweb community this process is known as POSSE an acronym for Post on your Own Site, Syndicate Elsewhere.

Anecdotally most in social media have long known that doing this type of workflow causes your content to be treated like a second class citizen, particularly on Facebook which greatly prefers that users post to it manually or using one of its own apps rather than via API.

This means that the Facebook algorithm that decides how big an audience a piece of content receives, dings posts which aren’t posted manually within their system. Simply put, if you don’t post it manually within Facebook, not as many people are going to see it.

It’s a lengthy and interesting post about Chris’s experience with how the Facebook algorithm controls his audience.

If his Mom likes one of his posts too soon, then Facebook displays his post mainly to close family.

More from Chris:

I can post about arcane areas like Lie algebras or statistical thermodynamics, and my mom, because she’s my mom, will like all of it–whether or not she understands what I’m talking about.

The problem is: Facebook, despite the fact that they know she’s my mom, doesn’t take this fact into account in their algorithm.

What does this mean? It means either I quit posting to Facebook, or I game the system to prevent these mom-autolikes.

Facebook allows users to specifically target their audience in a highly granular fashion from the entire public to one’s circle of “friends” all the way down to even one or two specific people. Even better, they’ll let you target pre-defined circles of friends and even exclude specific people. So this is typically what I’ve been doing to end-around my Facebook Algorithm Mom problem. I have my site set up to post to either “Friends except mom” or “Public except mom”. This means that my mom now can’t see my posts when I publish them!

I come back at the end of the day after the algorithm has run its course and my post has foreseeably reached all of the audience it’s likely to get. At that point, I change the audience of the post to completely “Public”.

You’ll never guess what happens next … Yup. My mom “likes” it! Even better, I’m happy to report that generally the intended audience which I wanted to see the post actually sees it. Mom just gets to see it a bit later.

That’s an interesting world, Facebook. But it’s one that is hard for me to grasp because I’m an old web person. I created my first HTML pages and Common Gateway Interface programs in 1996. I like message boards with flat commenting systems. I like RSS and other feed formats. I like maintaining a list of websites to visit.

I cannot imagine not seeing the content that I expect to see. It makes zero sense to me.


Tools like Granary, Indigenous, and InkStone are great pieces of the puzzle, as are open source CMS’s like Known and WordPress with support for Micropub, Webmention, and other IndieWeb building blocks. But, the reason that silos like Facebook, Twitter, and Instagram are popular is that they provide a convenient, easy, and attractive unified experience for content consumption, content creation, and interactions. In order to be successful, and drive mass adoption, the IndieWeb must provide a user experience on par with silos on all three of these fronts.

Between RSS and Atom, Webmention, and Micropub, the building blocks are there to create such an experience in a decentralized way, with participants in the network owning their own domains, websites, and data, pulling in content from a variety of sources via feeds, and creating posts, reactions, and interactions to their own sites with notifications to other participant sites.
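
For a sense of how small these building blocks are: a Micropub client creates a post by sending one authenticated, form-encoded POST to the site’s Micropub endpoint. A minimal sketch, assuming a hypothetical endpoint URL and access token:

```python
# Minimal Micropub "create post" request: a sketch with placeholder values.
# Assumes the third-party `requests` library.
import requests

MICROPUB_ENDPOINT = "https://example.com/micropub"  # advertised via rel="micropub"
ACCESS_TOKEN = "XXXX"  # normally obtained through an IndieAuth flow

resp = requests.post(
    MICROPUB_ENDPOINT,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    data={
        "h": "entry",  # create an h-entry, i.e., a note or article
        "content": "Hello from my own website.",
    },
)
# On success, the server replies 201 Created with the new post's
# permalink in the Location header.
print(resp.status_code, resp.headers.get("Location"))
```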

Today, most people’s experience of the web is through algorithmically generated, ad-supported timelines like Twitter and Facebook. Frequently, it’s on mobile devices in the native app clients for these silos, rather than through a web browser. That’s really a shame.

These algorithmically curated timelines are filling the gap that feed readers and aggregators like Google Reader left open. Web browsers have also ceded ground to silos, focusing purely on navigation, tab management, and search, rather than thinking about the bigger picture.

The ideal solution to this problem would be a native application for desktop operating systems and mobile platforms that places user experience at the forefront, and provides:

  • Content consumption for both the open web, through RSS/Atom, and silos like Twitter and Facebook in separate tabs or timelines.
  • Content creation for both the open web, through Micropub, and silos like Twitter and Facebook via syndication or their APIs.
  • Rich interactions for both the open web, through Webmention, and silos like Twitter and Facebook via their APIs.

A unified experience for the rebirth of the open web is a massive market opportunity. The building blocks are there. History has shown that these kinds of experiences can become massively popular and drive innovation.


Over 400 comments exist in the related Hacker News thread. Here are excerpts from the top comment:

Make your website of record your website. Make social media platforms and others (e.g. Google) secondary to that. Don’t let Google and Facebook control how you build your website. I am amazed at companies that take their websites and subjugate them to their Facebook page. You may gain social attention but you are handing over control.

Never, ever, ever say to contact me go to facebook.com/xxxx or my email address is xxx@gmail.com. Your site is yoursite.com and your email is youremail.com. Your login to the sites you build are email addresses, not tied to social media providers.

The closed internet providers are enhancements to your sites. They do not take the place of your site. If you follow this philosophy, you are supporting the open internet. Own your .com. Don’t let others own you by taking that from you.


HN comment:

It really depends on your pages.

If you are building a blog or a news site (a classic “content” site), it should absolutely work without javascript.

If you are building an admin interface for said news site, where articles are edited and published, it is okay to require javascript.

The sonniesedge.co.uk author concluded the post with:

I maintain that it’s perfectly possible to use the web without javascript, especially on those sites that are considerate to the diversity of devices and users out there. And if I want to browse the web without javascript, well fuck, that’s my choice as a user. This is the web, not the Javascript App Store, and we should be making sure that things work on even the most basic device.

From Tantek Çelik’s 2015 post js;dr = JavaScript required; Didn’t Read:

Pages that are empty without JS: dead to history (archive.org), unreliable for search results (despite any search engine claims of JS support, check it yourself), and thus ignorable. No need to waste time reading or responding.

Also known as, if it’s not curlable, it’s not on the web.

https://indiewebcamp.com/curlable

Because in 10 years nothing you built today that depends on JS for the content will be available, visible, or archived anywhere on the web.

All your fancy front-end-JS-required frameworks are dead to history, a mere evolutionary blip in web app development practices. Perhaps they provided interesting ephemeral prototypes, nothing more.


Making matters worse, these social silo readers have typically, if not uniformly, turned off all external access to their own RSS feeds long ago.

If you want to read content in Facebook, you have to log in and have an account and participate there directly, you cannot just subscribe to five peoples’ content via RSS and read it anywhere you want.

This monopolistic behavior is exactly the reason we call them silos. Content goes in, but doesn’t come back out.
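
For contrast, subscribing to and reading an open-web site’s RSS feed requires no account and no login. A few lines of Python, using only the standard library, can read the headlines from any ordinary RSS 2.0 feed (the URL below is a placeholder):

```python
# Read the latest items from an ordinary RSS 2.0 feed, standard library only.
# The feed URL is a placeholder; any public RSS feed works the same way.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/feed.xml"

with urllib.request.urlopen(FEED_URL) as resp:
    root = ET.parse(resp).getroot()

# RSS 2.0 layout: <rss><channel><item><title/><link/>...</item></channel></rss>
for item in root.findall("./channel/item")[:5]:
    print(item.findtext("title"), "|", item.findtext("link"))
```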


In my opinion, it’s an anti-open web situation when web authors cannot customize their websites or individual article pages in any manner that they desire. This is more likely to be a problem with CMS-hosted setups.

Medium also differs from earlier blogging services in a significant, contrarian way: it offers you, the writer, nearly zero options for the presentation of your stories. No matter what kind of story you write, or who your readers are, it gets packaged into a single, non-negotiable template.

As a fan of minimalism, however, I think that term is misapplied here. Minimalism doesn’t foreclose either expressive breadth or conceptual depth. On the contrary, the minimalist program—as it initially emerged in fine art of the 20th century—has been about diverting the viewer’s attention from overt signs of authorship to the deeper purity of the ingredients.

If that’s the case, we can’t say that Medium et al. are offering minimalist design. Only the veneer is minimalist. What they’re really offering is a shift from design as a choice to design as a constant [and constraint]. Instead of minimalist design, a better term might be homogeneous design.

On the other hand, a necessary side effect of Medium’s homogeneous design is that every story looks the same. If you agree that the role of typography is to enhance the text for the benefit of the reader (as I contend in who is typography for?), then it stands to reason that different texts demand distinct typography.

Among web-publishing tools, I see Medium as the equivalent of a frozen pizza: not as wholesome as a meal you could make yourself, but for those without the time or motivation to cook, a potentially better option than just eating peanut butter straight from the jar.

If you want to be part of something open and democratic, use open-source software. If you want to have your writing look great, learn something about typography (or hire a designer). I prefer web publishing despite its shortcomings, but if you don’t, then make an e-book or PDF and distribute it yourself.

As writers, we don’t need companies like Medium to tell us how to use the web. Or define openness and democracy. Or tell us what’s a “waste of [our] time” and what’s not. Or determine how and where readers experience our work. We need to decide those things for ourselves.

On July 12, 2017, during the Day of Action for Net Neutrality, The Ringer, which at the time was hosted at medium.com, could not customize its website to show support for Net Neutrality. It could not create a blackout page or anything else related to the day.

Yet, somewhat ironically, a writer at The Ringer whined about tech companies not doing more to show support for Net Neutrality.

Tech companies are afraid to disrupt the user experience to incite political action.

… many of the biggest consumer tech firms of our time are doing considerably less than they could (and have done in the past) to keep this issue top of mind.

And what did The Ringer do on July 12, 2017 to show support for Net Neutrality? Nothing, because it couldn’t do anything. It relies on a CMS-hosted solution at medium.com that provides The Ringer with few customization options.

This might be an example of how a CMS-hosted solution, like medium.com, does not support the open web: the site owner cannot engage in an activist-led website blackout or make changes to the site for any other reason.

In late July or early August 2017, The Ringer completed its move off of medium.com. It’s now hosted on Vox’s Chorus platform, which provides greater flexibility for site control.

More spitballing

Some of my thoughts expressed in this section were mentioned above.

To me, the open web means hosting content on one’s own domain name, with the content publicly available to everyone with an internet connection by using a web browser or command-line tools, such as cURL.

And ideally, the website owners would fund their own server hosting setups. This would be a server-hosted solution, instead of a CMS-hosted solution. The former provides more flexibility, but the server-hosted solution requires more technical skills than a CMS-hosted setup.

Websites that support the open web should use progressive enhancement to allow their content to be available when JavaScript is disabled or when their article pages are accessed via cURL or a text-based browser, such as Lynx. The content should be available for devices that help the visually impaired.
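
One rough way to test that: fetch a page’s raw HTML without a browser and check whether the article text is present, which is effectively what cURL or Lynx sees. A minimal sketch, assuming a placeholder URL and a phrase known to appear in the article:

```python
# Crude "is it curlable?" check: fetch the raw HTML without executing any
# JavaScript and look for a phrase from the article. Placeholders throughout.
import urllib.request

URL = "https://example.com/some-article"
EXPECTED_PHRASE = "a sentence you know appears in the article"

req = urllib.request.Request(URL, headers={"User-Agent": "curl-check/0.1"})
html = urllib.request.urlopen(req).read().decode("utf-8", errors="replace")

if EXPECTED_PHRASE in html:
    print("Content is in the markup: readable via curl, Lynx, and archivers.")
else:
    print("Content absent from the raw HTML: it probably requires JavaScript.")
```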

The open web means publishers have the ability to change the look of their websites to be anything that they want. No limitations on theming.

Some CMS-hosted solutions provide enough customization options on the server and for the browser display to satisfy most authors. But the server-hosted solution permits maximum flexibility with what occurs on the server, and how the content is displayed to the users. The open web allows publishers to change the software that produces their website while preserving the permalinks.

With a CMS-hosted solution, authors can move their content from one hosted solution to another, but doing so generally means switching CMS providers.

While I prefer the server-hosted solution, most authors don’t want to be sys admins, which makes a CMS-hosted solution better, since the CMS-hosting company takes care of managing the operating system and the web publishing software for security updates and bug fixes.

The open web gives authors the option to use their favorite text editors to create static HTML pages, and they can mark up their web pages any way they desire. No restrictions. Authors can create and update their pages locally on their laptops, tablets, etc. and upload them to their web servers. Or they can log onto their servers to create and update their web pages.

On the open web, authors rarely encounter a wall that cannot be knocked down. But it’s entirely possible that the easier a publishing system is to use and manage, the more restrictive it is for authors. Maximum flexibility might mean that more tech skills are required to manage the content and the software.

I read blogs that do not use a domain name, except for what Blogger provides, which would be yoursitename.blogspot.com. Is that the open web or a silo?

People who use Medium, WordPress, Tumblr, and Blogger without domain name mapping have carved out their own web spaces on the internet. That feels somewhat like the open web to me.

But those users have little to no access to modify the source code that creates their websites. Authors can make theme changes, but mostly, those users are walled in to what those services provide by default. And that may be good enough for those publishers.

  • tweet by an IndieWeb developer:

Funny how “the open web” is now used to refer to closed silos like Blogger, Twitter, and Medium

It’s comical that the above opinion is a Twitter post, but that IndieWeb user stores his social media posts on his own website, which is how the IndieWeb works. Any content posted elsewhere is also posted to the publisher’s personal website. The IndieWeb mantra is post on your own site first and then syndicate elsewhere.

Anyway, if someone creates a blog at WordPress.com, the content is hosted on WordPress/Automattic servers. The publisher won’t need to create a server hosting account with Digital Ocean, Dreamhost, Amazon Web Services, etc., and won’t need to download, install, configure, and manage the web publishing software. All of the server hosting and software management is handled by WordPress.

The publisher is free to focus on a website design theme and creating content. The publisher’s domain name would be yoursite.wordpress.com. Is that the open web or a silo? WordPress supports domain name mapping, which means publishers can buy domain names and point the names to their WordPress accounts, creating yoursite.com. Is the latter more open web than the former even though all hosting is the same?

This is what’s confusing. In the above scenario, the setup meets most of my requirements for the open web. The author with the above setup cannot make significant overhauls to the server-side source code, but most publishers don’t care about changing the CMS source code. Besides, authors can install WordPress plug-ins that satisfy their requirements.

When small publishers such as The Awl moved their content from their own server hosting accounts to Medium, did those publishers move from the open web to a silo even though their domain names remained unchanged? Confusing. At least when The Awl hosted on its own servers, it could change the CMS source code and create a unique look on the front end. But many publishers want to focus on writing, not on programming, designing, and sys-admin functions.

If I host at WordPress.com, Blogger.com, Medium.com, Tumblr.com, DigitalOcean.com, or Amazon.com/aws, how are some hosted solutions considered a silo while others are considered the open web? Again, self-hosting at DO or AWS enables full control over how the site is built and how it looks. Maybe degrees of open webness exist. That would make things even more confusing.

In my opinion, public websites that are meant to be read by browsing-only users and are built as JavaScript single-page applications (SPAs) do not support the open web. That’s an easy one, regardless of where the site is hosted. When JavaScript is disabled, these sites display no article content.

When SPA pages are fetched with a command-line utility like cURL, the downloaded content consists mainly of JavaScript code, or a link to the JavaScript that will fetch the content and display it dynamically in a web browser that can execute JavaScript. SPA pages fail in the text-based Lynx web browser. When a web page is considered not curlable, it means the page delivers no readable content unless JavaScript is executed.


Excerpt from a 2015 Hacker News thread about curl:

“If it doesn’t load through curl, it’s broken.” –someone

… requiring code execution in order to read data [text] is madness. I wasn’t saying not to do the fancy stuff but rather to start with something which degrades well and then have your JavaScript enhance that basic experience.

That last sentence touches on the concept of Progressive Enhancement.

If the SPA or JavaScript-heavy website, however, is a “web app” that requires a user to log in to perform functions, such as using email, preparing tax returns, building a market research survey, banking, shopping, etc., then I can understand the use of JavaScript, and I have no problem with it. But if JavaScript is required simply to read a website, then that’s anti-open web.

If an app other than a web browser must be downloaded and installed simply to read a website, then that’s also anti-open web. It doesn’t matter if the app uses web technology, such as REST, JSON, and HTTP/HTTPS.

But users may prefer apps over the open web, even when acting as browsing-only users. Designers and developers respond to user preferences.

An “old” 2010 Wired article with the sensationalistic title The Web is Dead. Long Live the Internet

As much as we love the open, unfettered Web, we’re abandoning it for simpler, sleeker services that just work. You’ve spent the day on the Internet — but not on the Web. Over the past few years, one of the most important shifts in the digital world has been the move from the wide-open Web to semiclosed platforms that use the Internet for transport but not the browser for display.

It’s driven primarily by the rise of the iPhone model of mobile computing, and it’s a world Google can’t crawl, one where HTML doesn’t rule. And it’s the world that consumers are increasingly choosing, not because they’re rejecting the idea of the Web but because these dedicated platforms often just work better or fit better into their lives (the screen comes to them, they don’t have to go to the screen).

I store most of my images at Flickr to be embedded into my web posts. That’s using a silo to support my open web sites. If Flickr disappeared without warning, my pages would contain missing images. But I back up my images to DVDs, or I used to, which permits me to retrieve the images and upload them to either my own site or another image hosting service.

I also store some of my images on a server that I pay to use, and I use my own image uploading web app: http://waxwing.soupmode.com. My Waxwing JavaScript code works within the browser to lower the quality and size of the images, which reduces the upload time and the amount of space consumed on the server. My code also provides an upload option for WiFi connections that uploads the entire, original image, and then my server code creates two or three smaller versions of that image. I would still need to back up the original image to some kind of cold storage, such as a DVD.
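
The server-side half of that idea is easy to sketch. The code below is not my actual Waxwing code; it’s a hypothetical illustration, using the Pillow imaging library, of how a server might create two or three smaller versions of an uploaded image:

```python
# Hypothetical sketch of the server-side step: create smaller versions of an
# uploaded image. Not the actual Waxwing code. Requires the Pillow library.
from PIL import Image

WIDTHS = [1024, 640, 320]  # arbitrary target widths for the smaller copies

def make_smaller_versions(path):
    original = Image.open(path)
    for width in WIDTHS:
        if original.width <= width:
            continue  # never upscale the original
        height = round(original.height * width / original.width)
        resized = original.resize((width, height), Image.LANCZOS)
        # Lower JPEG quality to cut upload time and server disk usage.
        resized.save(f"{path.rsplit('.', 1)[0]}_{width}.jpg", "JPEG", quality=80)

make_smaller_versions("uploads/photo.jpg")
```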

I use servers that are hosted at Hurricane Electric, Digital Ocean, and Amazon. I’m not using a server in my own home, except for my Tor site http://zwdqwr2p2xwkpbyv.onion.

When I host at DO or AWS, is that supporting or opposing the open web? I could lose my websites if the server hosting company deleted my accounts or went out of business, or if my credit card expired.

When a Facebook page is only accessible when logged into Facebook, that’s a silo, or the closed web. It’s odd that some businesses set their Facebook pages to a private setting that excludes people like me who do not have an active Facebook account.

But if the business’s Facebook page is available to everyone, is that the open web or still a silo? Based upon my requirements above, it’s a silo. The business owner has little to no access to change how the site looks and functions.

The publisher chose to host content at Facebook, instead of at WordPress or Medium. Maybe the publisher is unconcerned about owning a domain name. Maybe the publisher does not want to install and manage software on a server. And small business owners have more important things to be concerned about. Unless some guidelines are violated, the Facebook page will probably last as long as the business.

Quote by someone:

“If you aren’t paying for something, then you aren’t the customer. You’re the product being sold.”

One aspect of the open web could be that the author’s content and actions are not re-purposed by the hosting service and used for targeted advertising. That eliminates Facebook from open web contention.

Some people think that in the future, independent websites won’t exist. People, businesses, and orgs will publish on platforms, like Medium.com, Facebook, Instagram, and Snapchat. It’s already that way for some small, locally-owned businesses in the Toledo area, which do not have a website with their own domain name.

An easy example of publishing to a silo is Facebook’s Instant Articles, which only functions within Facebook’s native phone app. I don’t think that IA works in the company’s native app for tablets. It’s phone-app-only. Obviously, IA does not work on the web on any device.

Facebook’s Instant Articles relies on a Facebook-flavored RSS file. RSS is an open web technology, but a website’s default RSS feed won’t work. CMSs need to be updated to produce a different RSS file for IA.

Instant Articles is an interesting publishing mechanism, since updates made on the publishers’ websites are picked up by IA. But again, it’s available in Facebook’s native phone app only.

Phone-app-only is not the open web. I don’t want to install or use native apps on the phone except for web and gopher browsers. A theory: The Open Web is the Answer to App Censorship.

The debate will continue, regarding the open web and silos. Most users, publishers, businesses, and other orgs don’t care about such trivia. They’re too busy with other facets of life. They want to use the easiest tools available that provide the most reach to their readers, fans, customers, and contributors.

And if the preferred publishing and communicating platform for these users is Facebook, then that is NOT Facebook’s fault. It’s the fault of the open web geeks for failing to create easier systems.

The Blogosphere of the late 1990s to the mid-aughts was a holistic, organic, decentralized social network of sorts that relied on personal websites, comments, RSS, pingbacks, trackbacks, and the blog search engine Technorati.

Facebook launched in February 2004, but it was limited to specific email addresses or users until September 2006, when Facebook opened up to everyone. Facebook, as available to the world, is only about 11 years old. The internet began in the late 1960s or early 1970s. The web began around 1990. The web became popular with the masses by the mid-1990s.

The open web fans had a 10- to 15-year head start on Facebook. And now 10 more years have passed.

Automattic/WordPress permits users to download, install, and customize their software. That supports the open web. WordPress is the most popular CMS. Numerous other CMS, blog, and wiki systems exist that can be freely downloaded by publishers.

People who have built open source products have contributed to the open web. Examples include publishing systems, web servers, database servers, caching servers, programming languages, and frameworks.

I think that message boards such as Hacker News and the Stack Exchange sites contribute to both the open and the closed web, since discussions help inspire people to build things, and meaty discussions can alter the opinions of makers.

In my opinion, the IndieWeb people have done the most to support the open web. The IndieWeb attracts new converts every year. Currently, the advanced IndieWeb ideas are probably too technical for mass adoption. But the IndieWeb is taking a long, slow approach to change.

We don’t know what the next big thing will be. We don’t know for certain how people will be creating, sharing, and responding to content five to ten years from now. Hopefully, some of the Indieweb ideas will creep into the mainstream.