I have a vague idea of the point of AMP, which is to speed up the delivery of web pages. However:
1) I'm not convinced the web pages were really _that_ slow to start with, so it feels like an unnecessary project (well, excepting the fact that web-page bloat has massively increased as people use more and more javascript libraries &c.).
2) Perhaps I'm being a bit old fashioned, but I don't really understand what was wrong with a browser serving a web page made out of some HTML, CSS, and Javascript. It feels like we're replacing one technology with another (and that latter one comes from Google, which makes me nervous). Do we really want AMP to be the way we serve the web?
3) I haven't properly investigated, but I'm assuming that this delay is some script which is waiting for the data to load from the AMP CDN, and once that times out it displays the underlying content (I did check, and loading this page [1] is almost instantaneous, yet the content only displays after three seconds). Any insight into why it's seen as acceptable for this to be so slow when it is unnecessary for displaying the page?
[1]: https://www.ehow.co.uk/how_5840711_test-electrical-outlet-digital-multimeter.html
The problem here, however, is that this animation is usually triggered before the 8s elapse, as soon as the AMP javascript has been served from the CDN. Your adblocker, or something of the sort, is blocking it, which means you have to wait for the hard-coded 8s fallback to see the page even if the content has already loaded.
AMP is beautiful.
To be a valid AMP page (at least, according to the Google overlords) you _must_ use the CDN'd AMP scripts. This is the very antithesis of what the world wide web is supposed to be about. IMHO, it's not at all unreasonable to block stuff from domains you don't trust, especially domains controlled by large surveillance corporations. The web should be more resilient. AMP makes it brittle.
My thoughts exactly, this is what makes me nervous about AMP, it feels like it's pushing the web in a direction that is alien to its original ideas.
[edit]
Update: I got to a browser I could test it on. Unfortunately, it turns out uBO doesn't let you apply :style() rules to all pages indiscriminately, you have to specify a domain name pattern - which is pretty annoying, because the principle is evidently sound; this does work on the link in OP:
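The original filter didn't survive, but a reconstruction of a rule along those lines (my sketch, scoped to the OP's domain since uBO demands a domain pattern; the AMP boilerplate hides <body> behind an 8s animation, so cancelling the animation reveals the page) would be something like:

```text
www.ehow.co.uk##body:style(animation: none !important)
```
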
Hrmph.

Alternatively, you could use a CSS injection addon (or userContent.css, in Firefox) to add the rule to all pages:
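Something along these lines (my sketch, relying on the `amp`/`⚡` attribute that AMP documents carry on the <html> element):

```css
/* Sketch of a userContent.css rule: neutralize the AMP boilerplate
   animation that keeps <body> hidden for up to 8 seconds.
   AMP documents mark themselves with an "amp" or "⚡" attribute. */
html[amp] body,
html[⚡] body {
  animation: none !important;
}
```
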
Or do it with a userscript (i.e. via Greasemonkey or similar).

[edit 2]
It appears uBO is aware of a need for something like this - they provide an injectable "scriptlet" that adds the necessary CSS: https://github.com/gorhill/uBlock/wiki/Resources-Library#amp...
Unfortunately, scriptlet injection also can't be applied to every page indiscriminately. :/
So I'm just mentioning it here for completeness' sake.

This is clearly a bug, since blocking AMP is not related to blocking ads (all the normal ad-blocking rules should do the job just fine).
I naively assumed, to begin with, that they'd pushed out something broken. Except it never got resolved, which surprised me considering the length of the delay. So I resolved it by pretending they no longer exist.
I'll likely never know if they ever do resolve it.
That being said, this is most likely the culprit. If OP is blocking AMP CDN, the inlined CSS code will hide the content until the CSS animation completes after whatever the timeout is nowadays.
Lots of documents (amp or not) use the same "hide the screen until layout is done" trick to avoid multiple relayouts as the initial javascript is running and CSS is being fetched. More often than ideal, they don't have a fallback if the javascript doesn't load at all. AMP mandates it with this CSS animation, which is far better than nothing.
Also when served from the AMP Cache (https://www-ehow-co-uk.cdn.ampproject.org/c/s/www.ehow.co.uk... for the example shown), the layout algorithm that javascript runs is applied by the AMP Cache instead and the 8-second timeout code is removed completely (view source and take a look). This doesn't work on all pages - there are some features that require a client-side context, but it does work on this one. Websites can run the same server-side layout algorithm on their origin using a node library (https://www.npmjs.com/package/amp-toolbox-optimizer). There is also work being done to improve all of this (server side layout on more documents, and making the system easier to run on your own site).
The 3s observation from the original post is interesting. It may just have been an estimate of the 8s, or it could have something to do with how the document is configured. Looking at this document, there are some <amp-font> tags that the document author has added with a 3000ms timeout. These are tags that instruct the AMP javascript to change the document CSS class depending on the success or failure of loading a particular font. By default (not AMP specific), if a document loads a webfont for particular text, the browser will not display the text until the font is loaded. <amp-font> provides a CSS hook by which the author can do things like "hide the text for up to 3s, or until the font has loaded, whichever comes first". This page has some <amp-font> tags with a 3s timeout referencing fonts that have not been added to the document, which seems like a mistake by the document author, but I'm not sure. I was not able to reproduce the 3s experience though, so this may be incorrect speculation as to what happened.
A mandatory 8s lag for careful users doesn't sound graceful to me :-/
For HNers who weren't creating for the web back in 2009: we used to have another term as well, "progressive enhancement", which meant more or less "after we've got a baseline working on all supported browsers we can add nifty stuff that doesn't work in IE".
Perhaps it's luck, but outside of AMP I have never encountered a site that is so hostile about it, with a nearly 10s wait. I've rarely encountered even a second or so's delay, but two or three sites, including a large one, have the massively excessive AMP delay.
So to call it the same trick seems like a stretch from a user annoyance point of view.
It annoyed me as well before, though I realized as I stopped going there that this is for the best... for both my sanity and the website's bottom line.
On the other hand an ad blocking user is not necessarily producing no value for the site. I've had subs to newspaper sites, all of which I viewed with blocking on, and Independent will never now be one of them. If Guardian did likewise, and added AMP, I'd cancel my current sub and look elsewhere, which would be a definite loss of value for them.
I'm not so wedded to the views of any one outlet that I'd subscribe with this in place, or turn off blocking to subscribe, or feel I must subscribe to the same one indefinitely.
If it's a site I wouldn't have considered a sub or donation to, you're right, nothing is lost.
The content is there, it's just hidden through css. If anything, a screen reader has access to the content earlier than the unimpaired people.
I agree with the spirit of your message though, but that's not what the grandparent post claimed.
> Accessibility is the design of products, devices, services, or environments for people with disabilities.[1] The concept of accessible design and practice of accessible development ensures both "direct access" (i.e. unassisted) and "indirect access" meaning compatibility with a person's assistive technology[2] (for example, computer screen readers).
[0] https://en.wikipedia.org/wiki/Accessibility
For example, I'm autistic and use a "normal" browser, but garish websites (for example those displaying animated ads) are less accessible to me, because they lead me to become overstimulated, making me less likely to absorb and remember information presented on the page, as well as being physically and mentally exhausting.
I am perfectly willing to pay for content (and am, through Spotify, Netflix and Patreon), but much of the web is actively hostile to many disabled people. My physical disabilities don't prevent me from using my computer in the standard way but every time I misplace my mouse and try to navigate the web solely using keyboard (which isn't that far from how I usually use my computer, so it's not like I don't know the shortcuts) I am reminded how it must be for people who are unable to use a mouse and have to rely on other input methods. Many websites couldn't have worse UI when it comes to accessibility if they tried.
Modern browsers let the page author control this with the "font-display" CSS property [1].
[1] https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face/...
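A minimal sketch of how that property is used (the font name and path here are hypothetical):

```css
/* Sketch: show fallback text immediately and swap in the webfont
   when (and if) it arrives, instead of hiding the text. */
@font-face {
  font-family: "Example Sans";                     /* hypothetical font */
  src: url("/fonts/example-sans.woff2") format("woff2");
  font-display: swap;  /* alternatives: fallback, optional, block */
}
```
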
That sounds awfully convenient for whoever cooked this up.
That has to be rose-tinted glasses. Networks used to be excruciatingly slow. I would walk away waiting for page loads during the dial-up era. Our expectations have simply been adjusted over time.
(But yeah, I remember the internet being super slow. JPEGs coming down scanline-by-scanline.)
Right. It's super easy to make lightning fast web pages. In fact, that's the default state of a static HTML page. It actually takes a lot _more_ work to make pages slow by adding tons of javascript, dependencies, server-side applications, etc.
From the article linked elsewhere in this thread:

> The 90th percentile weight for the canonical version is 5,229 KB. [...] The 90th percentile request count for the canonical version is 647
There's the problem. Over 5 megabytes and 647 requests for one article? 5MB is equivalent to about 20 full-length novels or 1,000 pages of text. And 647 requests?! That must be one extremely comprehensive article, right? No, it's 1 request for a brief article and 646 extra requests wasting bandwidth to load unnecessary stuff. Then people wonder why it's slow. But someone put in a lot of work to fluff that page up to be that huge and slow and use that many requests.
I get antsy when I make a page over 100KB (excluding images, and even then anything over 1MB starts to feel excessive unless it's something particularly reliant on images). 5MB feels very high.
Fetched the same on the AMP Cache: https://www-fool-com.cdn.ampproject.org/c/s/www.fool.com/amp... 200KB cold cache. Removing the likely cached resources (amp javascript, fontawesome cdn font) and the payload is ~35KB.
It's certainly possible to make a lightweight page (Hacker News is a great example), but people aren't doing it. For whatever reason, AMP seems to be effective at making the web actually follow best practices.
Then those best practices aren't very good. In my experience, AMP is, on the whole, a detrimental thing that degrades web pages.
All I want is some way of telling sites to never give me an AMP page.
I am constantly amazed at just how bloated (and carelessly built) websites have become.
and looks fast mainly because Google preloads it: https://ferdychristant.com/amp-the-missing-controversy-3b424...
Nowadays, the landscape is way different. There are just so many "normal folk" on the internet now that content is being consumed at an alarming rate. There is so much stuff on the internet now, that even Youtube videos have become disposable. Most of the stuff that people read and watch now is consumed once and then never visited again because there's just not enough time to revisit the insane amount of content we're exposed to.
Nowadays, why does it matter if a website is made "brittle" if the content isn't going to matter in a few months anyways? And if you do want to archive something for later, shouldn't the words on the page matter more than the code behind it? After all, if a user 12 years from now wants to read your article, all they're going to want is your words and pictures. Code is always brittle because new technology makes everything obsolete.
AMP is also a worse experience than that was, because in the 90s you were usually waiting on images to render, and progressive display was usually possible, so you could start seeing that fuzzy JPEG fairly quickly and read the rest of an article; whereas AMP by design prevents anything from displaying until it's loaded and executed correctly, so you often have to reload the page to see anything at all when it fails.
This matters because most of where AMP was marketed to are competitive fields and that means it’s training users that they’ll get what they want faster and more reliably somewhere else.
One reason is brand perception. If your website is significantly slower and/or more broken than a competitor's, then eventually people will stop coming back. Presumably you want your brand to last more than a few months.
When comparing AMP pages to their associated canonical pages, the most striking difference is that AMP pages are significantly lighter (905 KB vs. 2,762 KB) and load significantly fewer assets (61 vs. 318 requests). (All numbers are median values from testing 50 pages.)
Many people have slow phones and expensive internet, and for them this difference enables them to browse news sites, literally.
See http://text.npr.org or http://lite.cnn.io for example
For example, AMP provides an ad component that is probably the by far most performant way to display ads on the web. Without AMP, the site has to use alternative ad solutions, which probably perform worse and load more JS.
The only way for the site to reduce JS would be to find an ad solution that is similar to AMP in terms of performance, and I fear such a solution just doesn’t exist.
And if you think a whitelist would work, all it takes is one look at how rapidly new JS libraries come and go to render that unfeasible
Simple ads can be served statically without JS.
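As a sketch of what that could look like (placeholder URLs; no particular ad network implied):

```html
<!-- Sketch: a static ad served with zero JavaScript.
     Just a link and an image with fixed dimensions so the
     layout doesn't jump when it loads. -->
<a href="https://example.com/landing?campaign=demo" rel="nofollow sponsored">
  <img src="/ads/banner-300x250.jpg" width="300" height="250"
       alt="Advertisement">
</a>
```
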
(Disclosure: I work on ads at Google, but I don't know much about how our spam detection works and I couldn't talk about it if I did.)
Would have been much better to use Lighthouse / PageSpeed as a basis for showing optimized pages.
Sorry for the rant. I'm just genuinely disappointed in what the web has become.
Is it just me or has the mobile web taken a nosedive in recent years? Imgur won't even let you upload images from mobile anymore without their app, reddit is getting progressively more annoying, "Get the app" buttons and modals and banners are getting progressively more prominent and dark-pattern-y, and the default for all web pages now seems to be to include <meta name="viewport" content="width=device-width, initial-scale=1"> regardless of whether the developers actually tested that their web page works on small screens, meaning web pages which would've just fallen back to desktop emulation mode in the past now simply don't work at all.
Disabling Javascript helps a lot, but the "request desktop site" toggle is still frequently necessary.
Imgur is, however, deserving of special mention for actively degrading their mobile experience - direct image links redirect to the mobile site now, which loads low-resolution images.
It’s unfortunate that so much work goes into UX design but at the same time the web has been going downhill for years.
I bet there are plugins for firefox mobile to remove those animations and nag window. Basically the "continue to mobile site" just close the banner since you are already on that site.
A few too many times, I have read an article, finding it a bit vague and honestly not that good, only to find that all the code blocks or some of the images were gone.
If paragraphs of text go missing, maybe you notice it instantly, but if it's images or code blocks which aren't explicitly mentioned in the body text of the article and just let exist to contextualize or demonstrate what the author is talking about, how will you know that something is missing? More importantly, is it even possible to develop a heuristic which catches almost all cases where parts are missing, without lots of false positives for cases which are just inexperienced authors writing bad articles?
It's because "desktop" web is difficult for newbies.
People like you and I love the "desktop" web because we grew up using desktop computers. We don't mind firing up our web browser, typing in a URL and poking around for tiny menu items using CSS that's optimized for big HD desktop monitors. That's what we're used to and that's what we think is normal.
Nowadays, everybody, even your grandma, is on the internet. Everybody's using an iPhone with a tiny screen and large fonts. Opening up a web browser to browse a website is just something people no longer do.
Grandma would much rather have an app on her home screen that connects her to the world rather than a bookmark in Google Chrome.
Not everybody is a nerd like you and me. The internet has exploded in ways we never imagined, and now we're going to have to deal with everything being optimized for the average user.
If mobile web pages did just enough to make it obvious to their users that there are apps which the user can download, I'd believe you that it's just corporations being altruistic and wanting the best user experience for their users. They don't, though. They intentionally break their mobile pages by removing important functionality; they have modals which reappear on every load, trying to trick you by making the big orange "Continue" button take you to the app store while hiding a tiny, easy-to-miss link to the page you're actually trying to visit; they put "get the app" buttons _over_ the content, sometimes even with no close button; they try to distract you from the content you're trying to read by making their "Get the app" buttons fucking animate around.
This is not just corporations being altruistic and keeping their users' well-being at heart. This is corporations wanting to optimize for user interaction and data collection, and the best way to do that is to make their users use a dedicated app for just their content, where the user will always be reminded that they should check Imgur whenever they unlock their phones, they can send their users a notification about what's trending on /r/AskReddit if they detect their users haven't used their app for a while, they can make sure their users won't leave for a competitor as easily.
> 2) Can you stop the "Download the app" popups showing up so frequently on mobile?
spez (Steve Huffman):
> 2) They've been gone a while, but we are chasing down an issue with incognito users seeing it more often. Please let me know if that's the case, or if you are having a different experience.
Also talking about dark patterns: https://old.reddit.com/r/announcements/comments/9ld746/you_h...
Unofficial apps make browsing 100x faster and less annoying.
I just tried fetching http://www.reddit.com/ in an incognito window (no personalization) on a fast connection and it took 1.2s to get the first byte on the connection. I can't say if this is typical, but the AMP Cache has much faster delivery (~30ms) even before considering the preloading.
https://i.reddit.com loads instantly, no JS required for 'performance'.
A mark in its favor though is local news sites. They are desperate to be at the top of the search page so when they join AMP, and you look at their AMP pages all of a sudden the loads of dark patterns they usually employ are absent.
As Google results have gotten progressively worse for common search terms (try searching anything health related), this sort of filter could bring back the web we see to where it used to be.
I'd love to see a nonGoogle search engine like this.
Website performance could be a minor tie-breaker between two similarly ranked websites that have the same information (and it probably is), but I don't see how it could be anything more than a weak signal in a search system where the objective is to find the most relevant results.
I'll wait 30 seconds if it means I'm loading the result I'm looking for. That a less relevant website loads in one millisecond is frankly immaterial to me.
I'm switching over to PHP because it's the simplest language that seems to let me set CORS headers without requiring middleware packages. It seems like AMP can't even do something as simple as a contact form without CORS headers. I'm sure an experienced programmer could solve this problem easily, but it makes things harder for amateurs like me.
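For reference, a minimal sketch of such a form endpoint in plain PHP. This assumes the AMP CORS requirements as I understand them (the `__amp_source_origin` query parameter and the `AMP-Access-Control-Allow-Source-Origin` response header); verify against the current AMP documentation, and in real code validate the origins against an allowlist rather than echoing them back:

```php
<?php
// Sketch of the CORS headers an AMP form endpoint needs.
// WARNING: echoing origins back unchecked is insecure; this is
// illustrative only. Validate against a list of trusted origins.
$origin       = $_SERVER['HTTP_ORIGIN'] ?? '';
$sourceOrigin = $_GET['__amp_source_origin'] ?? '';

header('Access-Control-Allow-Origin: ' . $origin);
header('AMP-Access-Control-Allow-Source-Origin: ' . $sourceOrigin);
header('Access-Control-Expose-Headers: AMP-Access-Control-Allow-Source-Origin');
header('Content-Type: application/json');

// AMP expects a JSON response body for amp-form submissions.
echo json_encode(['message' => 'Thanks for your submission!']);
```
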
It also doesn't seem to work with bootstrap so I've had to switch the whole thing over to basscss.
It seems like choosing AMP limits the other technology you can use.
This doesn't make sense at all. Even if I hadn't previously built a site on Django which served AMP pages, I would be willing to bet that no pure backend web framework limits what your frontend can achieve. Also, things are always harder for amateurs; just keep working hard, man.
https://news.ycombinator.com/item?id=18289349
Maybe an unpopular opinion, but if you're getting a slow load/blank page because you're blocking parts of the content, then yeah, "it's just you". Without the fallback mechanism that gets triggered after a few seconds, you'd just be stuck at the white page.
A good site will load faster than AMP, on a decent connection it may even be comparable to preloaded AMP. However, news sites have been pretty universally horrible at making good web sites. If someone stood behind their web developers and management with a whip, forcing them to build decent web sites and disallowing them from adding bloat, AMP would likely be unnecessary.
Part of the benefits of AMP is the possibility to preload, which allows for near-instant page opening from the search result page (unless you're blocking content, of course), the other part of the benefits is being that whip.
My overall experience with AMP has been positive.
Sadly, it's the management, marketers, and sales who are standing behind them and holding the whip. Most places measure engineering quality by how many features you can push through your CI/CD pipe (minus the fault reports and breakage that the pipeline generates).
When everything works perfectly it isn't appreciated as it indicates that engineering isn't running at full capacity. The same problem we have with security and all resilience topics: how do you measure & budget (or in security monetize) something that is supposed to be invisible to the end-user?
As for AMP, it's not just a tire fire; it's a clear attempt to break the open web and create lock-in, shutting out competitors.
The Huffington Post is one. On an older phone or tablet, the site is unusable on iOS. It will crash, and sometimes freeze. Long articles fill in excruciatingly slowly.
I would think in a data driven business, that there has to be significant numbers of people who actively avoid the site, and that they would want to minimize this.
Yet there seems to be no improvement over time.
I’ve sometimes wondered if it’s because literally every person in engineering and management at a company like the Huffington Post has a fast device, and they don’t notice how bad it is.
I find it paradoxical.
Multi user Hangouts is also horribly broken in chrome (first user can't see or hear anyone).
Have you tried not blocking the script?
The CSS animation is a fallback to that intended behavior.
The thing is, if you are on an internet connection where you frequently see loading times of 3s for the _basic_ content, then you are used to seeing partially loaded pages, and you likely love it when the connection happens to load the content you care about first, potentially leaving the page before it's completely loaded once you notice it's not what you are looking for.
The choice of 8s looks _a lot_ like they were looking for the highest time they could choose without making it look deliberately chosen, but which is long enough to make many "casual" users stop using the script blocker.
I.e. 10+s would look way too obvious as an intentionally long choice. 9s still looks big, as to some degree we are used to seeing 9 as nearly 10, from product prices and so on (e.g. 9.99€). So the highest number which doesn't look obviously high is 8. ¯\_(ツ)_/¯
It only looks malicious because you're looking for it to be malicious.
Better yet, make AMP opt-in.
It’s naive to think AMP has anything to do with improving the UX, it makes it worse in every way.
I really only care about the text on most websites, but I end up waiting for a lot of assets before I can see that. I understand why google wants to avoid having the page redraw the layout as assets arrive, that can also be annoying if done with a lot of relative formatting.
But for me, on average, the experience is worse than non-AMP sites.
EDIT: I just moved to desktop. If I block all scripts in uMatrix, OP's URL loads immediately (albeit with a full page of garbage above page content). If I block external scripts in uMatrix, I have to wait 8 seconds before anything appears.
No, there are many thousands of websites that render their content dynamically on the client - news or otherwise. Facebook, Twitter, and Reddit all make use of client-side rendering (and all provide news services).
>not deliberately hide content from users
Your browser also "deliberately hides content". It waits a few hundred milliseconds before triggering a paint, just to avoid content from jumping around. AMP pages do the same -- unless you break the mechanism of course.
Oh, come on. You and I both know that those are social media apps which act as news aggregators, and are not themselves sources of textual news articles. _fbpt was clearly talking about the latter.
nope.
This is like a car that tries to phone home every time you start it, so either you keep the phone service active and get a salesperson talking to you while driving, or you disable the phone and the car leaves the immobiliser active for a minute each time you try to start it.
As for the other half of your analogy - stop poking holes in the gas tank and it won't need active service.
This principle is also true when writing web pages; if you load an external resource, you need to check for and handle the case where that resource might not be available. Failing to load a resource can happen for many different reasons, not just someone blocking it with their client. Properly handling the error depends on what the resource is: display the content that is already available (perhaps in a less-than-ideal state), or if that isn't possible, display an appropriate error message. This includes cases where the Javascript itself isn't available (for any reason).
Networks are not reliable or universally available. Quality programming understands that and does the best it can when failures happen.
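As a sketch of that principle (placeholder script URL, arbitrary 3s failsafe): a page that hides content to avoid layout jumps, but reveals it whether the external script loads, fails, or javascript is disabled entirely:

```html
<!-- Sketch: hide content during setup, but guarantee it appears. -->
<style>body { opacity: 0 }</style>
<!-- No javascript at all: unhide immediately. -->
<noscript><style>body { opacity: 1 }</style></noscript>
<script>
  // Failsafe: reveal the page after 3s no matter what happened.
  setTimeout(function () { document.body.style.opacity = '1'; }, 3000);
</script>
<!-- Reveal on load, and also on failure to load. -->
<script src="https://cdn.example.com/app.js" async
        onload="document.body.style.opacity = '1'"
        onerror="document.body.style.opacity = '1'"></script>
```
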
All three cases are actually covered (full-script support, partial-script support, and noscript support). That's more resilient than most apps.
It is sad when the brightest minds in the world decide that the fix for "flash of unstyled content" is to show no content at all for close to 10 seconds.
Seriously: 8 seconds mandatory waiting would have been considered slow even 10 years ago and the only reason it passes now is because Google got a stranglehold on most of the web.
Edit: improve last paragraph
(Disclosure: I work at Google on making ads be AMP)
Perhaps a better option is finding ways to prevent content jumping around so much while assets are loading.
That 8s timeout is for loading the AMP JS from the CDN. You want a time limit that separates "you're on a slow connection, keep waiting" and "just give up, it's not worth it". I suspect it was set by looking at network graphs, but I don't know.
What the OP is doing, blocking JS and also ignoring <noscript>, is bizarre, and something you should expect to break sites.
> Perhaps a better option is finding ways to prevent content jumping around so much while assets are loading.
AMP does that very well, but only by taking control of the process of rendering, which requires JS.
(Disclosure: I work on making ads be AMP.)
My complaint is not the use of CDN, it is the forced delay to load a page when the CDN is not available or certain resources are blocked. This is a direct form of punishment from Google: refuse to let us track everything you read, we will make it uncomfortable for you to read anything.
Disabling the phone-home feature is nothing like poking holes in the fuel tank.
Then why the complaint about "phoning home"? Analogies are imprecise and brittle.
>This is a direct form of punishment from Google
If you show me some proof I'll be glad to believe you. Occam's razor says it's a simple fallback for unexpected results.
>Disabling the phone-home feature...
Blocking a required library? Who is straw-manning now?
Calling the phone-home feature a required library is part of the malintent from Google. Most websites I interact with don't need Google Analytics in order to function, for example, yet if I block Google Analytics some of them refuse to work. The simplest explanation here is that time-short programmers just copy the code presented by Google, who in turn want everything to report back to them so they can monetise third-party sites.
Building your site for AMP and blindly using the 8s timeout from the Google copy and paste archive is exactly the same story. You don’t need that much time to load all the assets, that timeout is there only to punish people who block the AMP resources.
It is the code required for the WebComponents to function. It's also charged with optimizations such as sandboxing iframes, making requests asynchronous, and calculating layout to reduce repaints.
Clearly it's not "phoning home". The script provides clear actions, as laid out on the amp project website.
>You don’t need that much time to load all the assets
On American broadband, perhaps. Other countries do not have the same infrastructure.
>that timeout is there only to punish people who block the AMP resources.
Citation, not speculation, needed.
(Disclosure: I work on AMP ads on non-AMP pages, which I like because it means the ads are declarative)
By that time I’d assume the server/connection was down and either refreshed or moved on.
https://www.ampproject.org/docs/fundamentals/spec/amp-boiler...
Here it is pretty-printed and simplified to not have the vendor-specific stuff:
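(Reconstructed from memory of the spec, so treat this as an approximation rather than the exact current boilerplate:)

```html
<!-- The 8s animation keeps <body> hidden until the AMP runtime
     loads and cancels it; steps(1, end) means visibility only
     flips at the very end of the animation. -->
<style amp-boilerplate>
  body {
    animation: -amp-start 8s steps(1, end) 0s 1 normal both;
  }
  @keyframes -amp-start {
    from { visibility: hidden }
    to   { visibility: visible }
  }
</style>
<!-- Without javascript, cancel the animation so the page shows
     immediately. -->
<noscript>
  <style amp-boilerplate>
    body { animation: none }
  </style>
</noscript>
```
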
The first bit is for browsers with javascript, but where the network is poor and can't load the minimal, highly cacheable required javascript file within 8 seconds. It uses CSS just in case javascript really isn't working anyway, such as being disabled per domain or something. The second bit is for browsers without javascript - the page is unblocked immediately.

The reason we need AMP now is that you can't otherwise preload a page in a privacy-preserving way.
Webpackaging will allow doing this without AMP though: https://github.com/WICG/webpackage/blob/master/explainer.md
(Disclosure: I work at Google on making ads be AMP.)
Don't give Google more power.
Implementing your website in AMP is basically handing the keys of the web over to Google. They're going to make it more and more ridiculous to stay AMP-compliant and you're going to be shut out if you don't play their game. Eventually they'll enable/disable features of AMP-based sites arbitrarily... similar to how they determine autoplay policy on their "approved sites list." Do you think Google should be the all-powerful being that determines what features you can have on your website? That's the future you're opting into if you implement AMP.
For webpages that do hide everything, I define my own CSS overrides which prevents the <html> and <body> element from ever being hidden, opacity less than 1, or similar. Further rules can be added to prevent <body> from being animated or having transitions.
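A sketch of override rules in that spirit (exact selectors and properties will vary by site; some pages may break if they genuinely depend on these tricks):

```css
/* Sketch: keep the root elements visible and kill
   hide-until-ready tricks in a user stylesheet. */
html, body {
  visibility: visible !important;
  opacity: 1 !important;
}
body {
  animation: none !important;
  transition: none !important;
}
```
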
What is wrong with just HTML, without CSS and JavaScript, or in some cases just plain text and not even HTML?
I keep reading about AMP on HN, but I have yet (professionally or personally) to actually come across AMP in the wild unless I intentionally went looking for it -- or am I consuming AMP and not even realizing it?
Plus you might not be browsing the subset of AMPish websites. The only AMP links I see are when people share mainstream news links, and then it's straightforward enough to manually extract the URL.
...to make up for the slowness of all of the ads Google and others serve. It's amazing how fast pages load with no ad networks involved.
So TL;DR AMP is rarely implemented for the right reasons, and the implementations reflect this.
That's because you use uMatrix and possibly uBlock.
AMP is about making the web faster for the common Joe.
The real reason for AMP is to allow Google to watch everything your visitors do so they can sell your visitors eyeballs to your competitors.
Is it just me or is µMatrix making everything slower?
I wonder what would be required for a format to not have this problem, and how it can be prevented.
In many cases AMP is negligibly faster than the mobile web page, and sometimes slower.