Microsoft Edge Experimenting with a Super Duper Secure Mode

(microsoftedge.github.io)

140 points | by todsacerdoti 989 days ago

12 comments

  • ho_schi 988 days ago
    I dare say it - turn off JavaScript!?

    Websites don't require JavaScript; what really needs JavaScript are single-page-web-app somethings with anti-patterns like infinite scrolling. You get two things in return: a "super duper" fast web and a more secure web browser. For example, Amazon supports usage without JavaScript very well. Another experience is Stack Overflow, where things like the preview and highlighting don't work. The highlighting could be added with server-side code, but that costs some CPU time - and it is not your CPU time, it is theirs. There is a tiny feature I would appreciate in HTML engines, "copy link to location", instead of using JavaScript - but there is the usable hierarchical address bar at the top (which Google tries to hide) which already serves this purpose.

    I'm not against JavaScript. JavaScript is just a tool, but I ask whether good websites require it. Hackernews uses 134 lines of JavaScript, which is already nearly nothing. Can you imagine using Hackernews without JavaScript?

    My web browser (WebKitGtk) provides a permission panel for every website:

    * advertisements

    * notifications

    * password

    * location

    * microphone

    * webcam

    * media

    A first step would be adding JavaScript there, too? Maybe some nasty cookie dialogs would disappear as a consequence. But that is not a loss?

    • rakoo 988 days ago
      I'm using uBlock Origin more or less in nightmare mode (https://github.com/gorhill/uBlock/wiki/Blocking-mode:-nightm...): all 3rd party stuff blocked, inline and 1st party scripts blocked. Some websites work out of the box and I love them, but too many still want me to enable 1st party scripts and 3rd party resources to at least display correctly even though most of the content I'm consuming is text.
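
      Roughly, that blocking mode corresponds to global dynamic filtering rules along these lines in uBlock Origin's "My rules" pane (a sketch only, from memory of the syntax; the linked wiki page has the exact recipe):

          * * 3p block
          * * 3p-frame block
          * * 3p-script block
          * * inline-script block
          * * 1p-script block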

      This is also one of the reasons I'm hoping more people write in Gemini, because it's just text.

      > Can you imagine using Hackernews without JavaScript?

      It definitely works; it's just a few small features that need JavaScript. The most important for me is the ability to collapse sub-threads, which is mandatory when reading threads with lots of comments.

      • erk__ 988 days ago
        HackerNews would not really work on Gemini though, as there are strict limitations on how long URLs can be, and the only way to send data is by appending it to the URL.
        • rakoo 988 days ago
          I agree, but HN isn't really a "website", it's closer to an application: you don't really have documents. The ideal interface for HN's functionality is Usenet, or maybe mailing lists: you have many people exchanging messages about a specific subject, quoting and replying to each other, all in very hierarchical threads. It's only because HTTP is so widespread that HN uses it, but if it weren't, I would bet we wouldn't even be talking about JavaScript.
    • axiosgunnar 988 days ago
      How is infinite scroll an antipattern?

      It's simply like having an automatic transmission in an ICE car.

      The motor type is an implementation detail the user does not want to care about.

      I want to press the pedal and the car accelerates. What happens in the background does not concern me.

      Same for websites.

      If I see a list that is larger than the height of my screen, I want to be able to keep scrolling until I decide to stop.

      I do not care that the creators of the website have structured their database so that the current DB would return 573738 entries.

      I also do not care that the way the developers decided to present the application to me (via a browser) means that every row entry is an HTML element and showing 573738 HTML nodes at once would slow down the browser.

      I just want to keep scrolling (and will stop after a few hundred rows anyways).

      Having this kind of ivory-tower fringe opinion, like "infinite scroll bad", is gate-keeping by middle-aged backend devs who panic at the speed and impact of the frontend world, plain and simple.

      • 411111111111111 988 days ago
        I wouldn't call it an antipattern either - it literally does exactly what the programmer wants, but it's definitely a dark pattern.

        Most of us humans have an urge to finish things, but infinite scroll by definition never finishes. This increases user engagement by exploiting our psychology, which is basically the definition of a dark pattern.

        There is an argument to be made that it also increases usability, so its inception probably wasn't malicious. But it's definitely another product of today's addiction-driven design philosophy if you just look at the psychological effect it has on users.

        • axiosgunnar 988 days ago
          Infinite scroll is a tool.

          Your problem is with pointless dark-pattern ridden attention-grabbing social media using that tool (and rightfully so!).

          Now imagine a useful application, like an email program.

          Infinite scroll is useful there.

          Imagine having a paginated inbox!

          • Uristqwerty 988 days ago
            Imagine an email program that, knowing how many total items there are in the inbox, gives you finite scroll! Even if it only streams metadata for nearby items, so that suddenly dragging the bar means waiting for a network round-trip before content appears, it avoids one of the problems with infinite scroll: the scrollbar no longer showing a reliable absolute position.

            A better case for infinite scrolling would be chat history, where you'd want an entirely separate UI widget for "jump to date", bidirectional infinite scroll that unloads distant content, and a way to grab permalinks. Missing any one of those features (or the guarantee that messages will never be re-ordered for marketing purposes), infinite scrolling becomes a hindrance.

          • snomad 988 days ago
            I prefer a paginated inbox; it's that way in Gmail for me (maybe I turned on some setting years ago?).

            The pagination is a nice break and lets me know I am done: everything on back pages has already been processed.

          • WorldMaker 988 days ago
            A paginated inbox can be extremely useful. Pagination gives you a "you are here" context: you are on page 3 of 20. That context is great in use cases such as "I need to review everything in this inbox". If you just finished page 3 of 20, you know you are roughly 15% done. If for some reason you are interrupted and need to close the app and come back, you know you can probably start right where you left off, on page 4 of 20.

            Finite scroll is a compromise where you get some amount of "you are here" context from the scrollbar thumb's size and position. If you are interrupted you can generally make a visual guess about where you left off, maybe.

            Infinite scroll is a terrible tool for an inbox because you no longer have that context. Scroll too far and the scrollbar thumb changes size because new items came into view. Most infinite scrolls hide the scrollbar thumb entirely, precisely because it has become useless. Try to resume where you were in your inbox after an interruption and you have no idea how far to scroll down, and you can't just use the scrollbar thumb to hit an approximate spot. You have no context for where you are; you are lost in an infinitely scrolling maze of items, all alike.

            Even if infinite scroll weren't a dark pattern (and it is; there are enough behavioral-psychology studies now showing that infinite scroll feeds addictive behaviors), it is a terrible tool to work with if you are trying to get stuff done. It has none of the "you are here" and "here's how you can come back to where you were if you need to leave" context. Both of those things contribute to why it is such a dark pattern, and both are strong reasons why it is an anti-pattern anywhere you expect people to work and get stuff done. Pagination is great for getting work done and infinite scroll is awful; finite scroll is an alright compromise between the two for some apps.

          • ptx 988 days ago
            Imagine an email program that isn't built with web technology. It could actually display all the items in your mailbox, with a normal and usable scroll bar, without "infinite scroll". I distinctly remember applications being able to display lists in the before-times.
          • ho_schi 988 days ago
            Dark pattern would probably be a better name. Infinite scroll works, you're right - for example in every file browser or mail program, which shows you the visible portion on screen while still letting you scroll through all items. On the web, infinite scroll is often used to load and reload stuff, and you often cannot link to an item directly.
          • what-the-grump 988 days ago
            Wait, is there a way to make Gmail scroll? Always had an issue with pagination in Gmail.
      • amanaplanacanal 988 days ago
        It does break searching for content within a web page.
        • axiosgunnar 988 days ago
          Ok, but the content wouldn't be there (and thus searchable) either if the website used pagination, no?
          • paulmooreparks 988 days ago
            In that scenario, I search the page, and if it's not found, I go to the next page and search there. Repeat to the end.

            That algorithm breaks on infinite scroll.

      • nsonha 988 days ago
        What form of book do you prefer to read? One with pages, or a scroll that leaves you confused at all times about what part of the thing you are on or have already read?
        • axiosgunnar 988 days ago
          In fact I do prefer to read books as a large seamless HTML file rather than a paginated PDF file that has cuts (new pages) at arbitrary positions.

          Now, of course the ideal way to read a book is probably a navigation/outline sidebar on the left, and then the content of the current chapter in the main pane.

          Kind of like the Rust book: https://doc.rust-lang.org/book/

          • nsonha 988 days ago
            So you have no problem with a scrollbar that jumps to the next chapter when you shift it by a couple of pixels, then?

            Of course, why would you have a problem with that; the only navigation one should perform while reading is changing chapters - who wants to go to arbitrary paragraphs?

    • bob1029 988 days ago
      I'm 100% on board with this line of thinking, but the edge case that the business keeps complaining about is being able to push async events to the client.

      How would we accomplish this sort of UX without JavaScript?

      • saurik 988 days ago
        I think the argument is that while a good amount of my daily web browser usage might like that feature (as I do in fact use it as an application sandbox), a vanishingly small number of the separate websites I visit on any given day (which are all one-off pieces of content I am accessing) need that functionality, and I can whitelist those by turning JS on for them, as they are pretty much the same few websites every day.
      • kall 988 days ago
        I believe you can utilize partial HTTP response streaming that never stops, and there is even a way to replace HTML fragments. I'm sure I've seen this, but I am having no luck finding anything about it.

        Edit: here it is https://news.ycombinator.com/item?id=16319248
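
        A minimal sketch of the idea (not necessarily what the linked thread describes): keep a chunked text/html response open and append markup as server-side events happen, so the browser renders updates with no client-side JavaScript at all. Assumes a Node-style server; the endpoint and the timer are stand-ins.

            import { createServer } from "node:http";

            // Keep one chunked text/html response open per client and append
            // markup as events occur; the browser renders each chunk as it arrives.
            createServer((req, res) => {
              res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
              res.write("<!doctype html><title>events</title><ul>");

              // Stand-in for a real async event source (queue, pub/sub, etc.).
              const timer = setInterval(() => {
                res.write(`<li>server event at ${new Date().toISOString()}</li>`);
              }, 1000);

              req.on("close", () => clearInterval(timer));
            }).listen(8080);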

      • roenxi 988 days ago
        That ... is an interesting question that is not really a problem for someone who wants to browse the web without JavaScript - which is almost the only person whose choice really matters when it comes to user security.

        Some sites I've accidentally browsed without NoScript make me question how people access sites with JS enabled at all. There are some shockers out there.

      • ho_schi 988 days ago
        For this edge case I suggest handling JavaScript like webcam access: if a website wants to use it, it can, but it needs to convince the user. Then we have an agreement between both sides :)

        I myself maintain some weird JavaScript to allow hardware access. But I think business itself should not be the driving force for technical development and society.

    • ksec 988 days ago
      I need a WYSIWYG editor, a very simple one, for forum use. And some sort of Ajax or Pjax function.

      I think that covers 90% of my needs, if we could get that without enabling JS. Some functions could move back to the server at the expense of more CPU cycles.

      Although pushing more features to HTML isn't exactly a great idea either.

    • grandpoobah 988 days ago
      How do Luddites like you still exist?
    • austincheney 988 days ago
      What I find most interesting as a JavaScript developer is that almost no open source projects will touch the stupidity of giant SPA frameworks. Seriously, if you are writing software on your own time why burn that time away digging in the trash?

      On the contrary, JavaScript developers who don't contribute to open source cannot write two lines of code without some disgustingly bloated framework and a million dependencies to do half their job for them. You can't even get hired without completely caving in to the stupidity. It feels like a cult membership that lights money on fire.

      • phist_mcgee 988 days ago
        Yeah dude this overgeneralisation is not cool. I'm a JS dev, and you're insinuating that since I don't write FOSS, I am a brainless monkey who glues frameworks together. Classic elitism, and it shows your hand as a gatekeeper. Not cool.
        • austincheney 988 days ago
          Not exactly. I am insinuating you are a brainless monkey if you need a ginormous SPA framework to build a simple web page with a couple click events.

          > Classic elitism, and it shows your hand as a gatekeeper. Not cool.

          Cry me a river. I am not being elitist by suggesting you should learn to do your job. I really don't care how uncool it sounds or how many tears you shed.

          I am not sure what you or the downvote brigade want here. Do you want me to feel sorry for you?

  • nwah1 989 days ago
    This is actually really interesting. Bravo to the Edge team.

    And if I remember correctly, writable memory pages were the main reason why iOS banned browsers like Firefox from embedding their own rendering engines.

    Perhaps this kind of approach could address such concerns and enable other rendering engines.

    • habitue 989 days ago
      Well, like the real reason is that the web is a threat to the app store. So, likely we'd just see some new excuse (if they even bother)
      • omk 989 days ago
        Ironically, in the early days Apple was redirecting developers to the web to build apps for the iPhone. But then it seems they discovered a money minting model.
        • cube00 989 days ago
          Along with a vendor lock in model
      • gbrown 988 days ago
        How haven’t they been hit with antitrust action over that?
        • JetSpiegel 975 days ago
          The antitrust laws are a dead letter.

      Facebook buying two of their competitors resulted in crickets.

  • fwsgonzo 988 days ago
    This reminded me of my experience handling the frontend side of Varnish using a RISC-V emulator. The most important thing is that the emulator is quick to bring up and quick to tear down, with almost zero syscall (10ns) and vmcall (4ns) overhead. So, if nobody is doing any heavy computation, the most important thing is getting the base overheads down. For example, the RISC-V emulator could handle the full frontend pipeline in less than 1 microsecond. That's going to be hard to beat. I wonder if the same could apply to a standard website with a simpler kind of WebAssembly emulator instead of JavaScript - and by that I mean replacing JavaScript with WebAssembly completely, and building all the tooling necessary to make it nice and easy.
  • npteljes 988 days ago
    The name immediately reminded me of the "Oopsie whoopsie" meme from some time ago.

    https://knowyourmeme.com/memes/oopsie-woopsie

  • beefjerkins 989 days ago
    I'm surprised at how few regressions there were in the tests they ran, given that they completely disabled the JIT. This could be very useful as a default 'mode' for websites, with the JIT able to be turned on for trusted websites if the user would like more performance.
    • rhdunn 989 days ago
      They did note that the JavaScript benchmarks were reduced by up to 58%, while also noting that users generally won't notice the difference.

      I would be interested to see how this affects the performance of websites that make use of complex JavaScript for things like charting/visualization (like the D3.js demos, or online formula-graphing tools), audio waveform rendering/processing, games, and other complex uses of JavaScript (including things like Vue, React, Bootstrap or other JavaScript UI frameworks).

    • hawk_ 989 days ago
      I know where you're going, but this can easily turn upcoming players into second-class citizens, further reinforcing the big-tech monopoly.
      • Retr0id 989 days ago
        Assuming "trusted websites" is a user preference, why should it matter?
        • vladvasiliu 989 days ago
          Because "defaults are forever" or something like that.

          Most people won't alter those settings, so whatever is "trusted by default" will run faster. The average user will just note that some sites are very fast, while others are very slow.

          • Vinnl 988 days ago
            Basically what you see with instant messengers on Android: phones usually come with battery-saving exceptions for WhatsApp, so when people install Signal, it looks bad for not delivering messages as reliably as WhatsApp.
          • signal11 988 days ago
            Everything’s a trade off, but this one is worth trying imho.

            There are lots of things that could be done to even the playing field, e.g. requiring all browsers to come "out of the box" with zero sites trusted.

            This would incentivise regular sites not to use heavy JS, if they knew it wouldn't be JITed by default.

            And if you use, say, Salesforce, by all means trust the site. But that tiny bit of friction is a good thing imho, analogous to running 'chmod +x' on Unix.

            In general, I think it’s time to say that browsers should have a more refined security model, and letting every darn site on the internet access to run code on your computer is maybe not a great idea.

          • janfoeh 988 days ago
            I don't think it's quite so dire. Remember that big sites like Facebook at least used to display warnings in the developer console, akin to "DO NOT PASTE THINGS IN HERE YOU RECEIVED FROM STRANGERS"?

            There is a sizeable subset of people who are curious and do care, and who would be eager to try that "one weird trick that speeds up <hot web property du jour> 200%" spreading through their Telegram group.

            But for the most part, non-technologically-inclined people seem to have a Hindu-cow-like frustration tolerance when it comes to technology. If Windows takes twelve minutes to boot and your browser's viewport has shrunk to the size of a postage stamp due to toolbars, then that's just the way it is.

            I would wager that for them, site Y running half as fast as site X matters a lot less than you think.

    • kenny11 989 days ago
      Or maybe instead of trusted websites they'll move to a model where you need digital signatures on your JavaScript code to enable high-performance mode, just like you need to code sign Windows applications to avoid scary warnings about what they might do to your computer.
  • jl2718 988 days ago
    So… JIT doesn’t actually improve performance. Nor does virtual DOM. What next? It seems like web browsers and frameworks are full of this cargo cult magic that doesn’t actually work, but must be used because everything else is built on top of it. When can we call this interpreted DOM experiment over and go back to compiled programs on a window API? It seems we’re headed there anyway with WebASM and WebGL, but every program has to carry its own UI services, kind of like video games in DOS, which is also right about where performance is today too.
    • ToFab123 988 days ago
      I enabled this setting yesterday and I am unable to tell any difference when browsing my normal websites.

      I wonder why the JIT was put in in the first place, if it has little to no end-user impact when removed. JIT sounds like a great deal of doing nothing, except for creating 50% of all bugs.

      Is this a case of JIT being meaningful when it was first introduced, but over time advances in other areas have made it obsolete/redundant?

  • heisenbit 988 days ago
    Web pages, where both the costs and the benefits of compilation are short-lived, may be one thing. But what about cached code? And what about real applications running in the browser?
    • nicce 988 days ago
      We have come a long way in establishing transparent formats and protocols for running stuff in our browsers (HTML, JavaScript, CSS, JSON, the HTTP protocol), yet we are moving towards compiled binary code, towards a black-box scheme.

      Users will have less power over the content in their browsers with compiled code, and things like ad blockers become challenging to implement again. I'm not really a big fan of this trend. For example, Google Docs is being rewritten to use canvas, and who knows what it actually does behind the scenes.

      • pjmlp 988 days ago
        Because the Web should never have turned into a general-purpose VM; HTML 4 was more than enough for interactive documents.

        For that, we do have networking protocols.

    • pjmlp 988 days ago
      Back to native applications with standard networking protocols.
  • mwcampbell 988 days ago
    I wonder if this change will make it much harder to exploit sandbox escape vulnerabilities, which IIUC are in the main browser process, not the renderer process. What is the impact of those vulnerabilities compared to ones in the renderer process?
  • thrill 988 days ago
    "our tongue-in-cheek name will likely need to change to something more professional"

    I know, I know! "Super Duper Secure Mode For Professional Enterprise Datacenter Unlimited Seats"

  • high_byte 989 days ago
    No change in performance? I'm in, but I don't see any chrome://settings or chrome://flags entry for this in Brave, Chrome or Edge.
    • arthurfm 988 days ago
      In Edge Canary, Dev or Beta, go to:

          edge://flags/#edge-enable-super-duper-secure-mode
  • tester34 988 days ago
    How about going further?

    Disabling the CPU's speculative execution on untrusted code (the browser), too.

    • rocqua 988 days ago
      If your JavaScript is no longer getting compiled down to machine code, any kind of speculative-execution attack is going to be soooooooo much harder to pull off that I think disabling the JIT is probably the most effective Spectre-like mitigation a browser could do.
  • tinus_hn 989 days ago
    Interesting but I am not a fan of how they don’t mention disabling features like WebAssembly in the short description.
    • asiachick 989 days ago
      They did mention it in the long description. It's off for now, but they are planning on turning it on.

      I'd guess it's safer than the JIT because the translation to assembly is simple, or can be simple; it's not trying to do the complicated work of analyzing a dynamically typed language and applying different optimization strategies.

      • csande17 988 days ago
        I bet they'll end up using a pure bytecode interpreter for WebAssembly as well, one that just runs the operations one by one instead of converting them to machine code. It'd be the same "slower but safer" trade-off that the mode uses for regular JavaScript.
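
        To illustrate "runs the operations one by one": a toy stack-machine interpreter loop in TypeScript, with a made-up instruction set rather than real WebAssembly encoding - the general shape of the technique, not Edge's implementation.

            // Toy opcodes for a stack machine; not real Wasm encoding.
            enum Op { Const, Add, Mul, End }

            // Decode-and-dispatch loop: each opcode is executed directly,
            // and no machine code is ever generated.
            function interpret(code: number[]): number {
              const stack: number[] = [];
              let pc = 0;
              while (pc < code.length) {
                switch (code[pc++]) {
                  case Op.Const: stack.push(code[pc++]); break;
                  case Op.Add: stack.push(stack.pop()! + stack.pop()!); break;
                  case Op.Mul: stack.push(stack.pop()! * stack.pop()!); break;
                  case Op.End: return stack.pop()!;
                }
              }
              return stack.pop() ?? 0;
            }

            // (2 + 3) * 4 = 20, evaluated without any code generation.
            console.log(interpret([Op.Const, 2, Op.Const, 3, Op.Add, Op.Const, 4, Op.Mul, Op.End]));
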
      • tinus_hn 989 days ago
        I’m sure there’s a great reason but the policy description reads like ‘turn this on for more security and no important downsides’

        > Disables the JIT and enables new security mitigations to provide a more secure browsing experience - Windows