So, let's look at an example in the real world: The Guardian has already implemented AMP, which you can easily test by appending "/amp" to any guardian.co.uk URL. I'll take a deeper look at this one: More plastic than fish in the sea by 2050, says Ellen MacArthur.
The result is truly amazing: the page consists of 107 elements and loads even slower than its non-"amp"-lified counterpart. Here's how it looks in Firefox's inspector (the screenshot is incomplete). Wow!
Of course it would be really easy to make the web as fast as in "the old days": just leave out all the bloat. To show how this works, I re-made the above Guardian page: here's my old-school amplified version. This version consists of only 5 elements (the HTML document and 4 images); I left out the headlines at the bottom and the advertisements, though. It renders far faster than the AMP version and is readable on any device, PC or mobile.
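The skeleton of such a stripped-down page might look something like this (a minimal sketch; the image filenames and alt texts are placeholders, not the Guardian's actual assets):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>More plastic than fish in the sea by 2050, says Ellen MacArthur</title>
</head>
<body>
  <!-- one HTML document plus a handful of images: nothing else to fetch -->
  <h1>More plastic than fish in the sea by 2050, says Ellen MacArthur</h1>
  <img src="lead.jpg" alt="Plastic waste floating in the sea">
  <p>Article text goes here…</p>
</body>
</html>
```

No JavaScript, no web fonts, no trackers: the browser makes a few requests and is done, which is why a page like this renders quickly on any device.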
So, what is this AMP thing for?
In reality it just hooks up "publishers" (that is: companies that make money by creating web pages full of ads, which try to lure in viewers by also containing some "journalistic" content) so their content gets delivered through Google's networks. Google then learns more about people's reading habits, even if the advertisements on those "amplified" pages are not served by Google itself. And, of course, it makes it possible to further the surveillance of billions of people all over the planet. I bet it won't take long until the first law enforcement officials demand access to Google's AMP logs.
And along the way this initiative is yet another blow against the open web.