So... Firefox has made multi-process mode mandatory

Discussion in 'Tech Discussion' started by Ai chan, Jul 11, 2019.

  1. Wujigege

    Wujigege *Christian*SIMP*Comedian

    Joined:
    Oct 6, 2016
    Messages:
    16,265
    Likes Received:
    15,756
    Reading List:
    Link
    Just use bookmarks :facepalm:
     
    Kaminomikan likes this.
  2. M4rcosR3is

    M4rcosR3is Well-Known Member

    Joined:
    Feb 25, 2017
    Messages:
    358
    Likes Received:
    465
    Reading List:
    Link
    I was using bookmarks. I even had a great add-on (which stopped working) that was perfect for saving a tab quickly and easily; most importantly, it kept a reminder and a timestamp so I could track everything for later reading without much trouble. But, as I said, it stopped working!
    So after that I kept adding bookmarks and creating folders. After all, if the add-on could do it, so could I! And I kept at it, just like @Ai chan said in a previous message. Earlier this year I had 9 million+ bookmarks. It was so many that when I tried to do a clean install of Firefox after formatting the PC and import my bookmarks from the cloud, Firefox sometimes crashed, so I had to purge them at the source and restore an old backup with fewer bookmarks. Now I have 4k-5k bookmarks left, and only because I'm too lazy to keep cleaning out the rest. So no, using bookmarks is a CURSE, not a solution.
    I'm a lot better off with the new browser.
    And yes, this is a true story. It may even seem to fit suspiciously well with what was said before, and I could be writing it just for the lols, but it's true. DO NOT USE BOOKMARKS IF YOU ARE SOMEWHAT LAZY AND HAVE PROBLEMS WITH DEPRESSION. THEY WILL EAT YOU WHOLE!
     
    Wujigege likes this.
  3. Wujigege

    Wujigege *Christian*SIMP*Comedian

    Joined:
    Oct 6, 2016
    Messages:
    16,265
    Likes Received:
    15,756
    Reading List:
    Link
    Then you have a problem: you hoard.
    I was sad when my Firefox sync failed because the developer stopped supporting it and I lost all my bookmarks.
    It was one of the reasons I moved to Chrome, but in hindsight it was liberating.
     
  4. lnv

    lnv ✪ Well-Known Hypocrite

    Joined:
    Jan 24, 2017
    Messages:
    7,702
    Likes Received:
    9,044
    Reading List:
    Link
    As RAM-hungry as Chrome? That doesn't sound so bad...

    [IMG]

    Chrome used to use a lot of RAM back in the day, but optimizations have made it one of the more RAM-efficient browsers.

    That said, I don't think you need to switch away from Firefox. There are optimizations you can do to lower memory usage.

    There is no new version of IE; it is dead. Not to mention a lot of webpages won't work with it going forward.
     
    Wujigege likes this.
  5. TamaSaga

    TamaSaga Well-Known Member

    Joined:
    Oct 11, 2016
    Messages:
    1,726
    Likes Received:
    2,173
    Reading List:
    Link
    Because there is currently a glut of it, RAM is rather cheap these days. You can get 32GB for about $100.
     
    Chrono Vlad likes this.
  6. Kaminomikan

    Kaminomikan 神のみ感

    Joined:
    Feb 8, 2016
    Messages:
    983
    Likes Received:
    1,071
    Reading List:
    Link
    wow... people can be so...

    I'll recommend this add-on for Firefox (I think it also works with Chrome): Auto Tab Discard.
    What it does: once you reach a set number of open tabs (you pick the number), it unloads the inactive tabs (you also set how long a tab must be idle before it counts as inactive), releasing their resources. The tab stays in the tab strip but unloaded, until you focus it again.
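    For the curious, the policy described above is simple enough to sketch in plain JavaScript. This is a toy model with made-up names, not the add-on's actual code; a real extension would call the browser's `tabs.discard()` WebExtensions API on the chosen tabs instead of just returning a list:

```javascript
// Toy model of an idle-tab discard policy: once the open-tab count
// exceeds maxTabs, any non-active tab idle longer than idleMs is a
// candidate for unloading.
function pickTabsToDiscard(tabs, { maxTabs, idleMs, now }) {
  if (tabs.length <= maxTabs) return []; // under the limit: do nothing
  return tabs.filter(
    (tab) => !tab.active && !tab.discarded && now - tab.lastActive > idleMs
  );
}

const now = Date.now();
const tabs = [
  { id: 1, active: true,  discarded: false, lastActive: now },
  { id: 2, active: false, discarded: false, lastActive: now - 60_000 },
  { id: 3, active: false, discarded: false, lastActive: now - 5_000 },
];

// 3 tabs open, limit of 2: only tab 2 has been idle long enough.
const victims = pickTabsToDiscard(tabs, { maxTabs: 2, idleMs: 30_000, now });
console.log(victims.map((t) => t.id)); // → [ 2 ]
```

    The point of the design is that nothing is closed: a discarded tab keeps its place in the tab strip and simply reloads when you click it.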
     
    Wujigege and Ai chan like this.
  7. Chrono Vlad

    Chrono Vlad 『Banned From Drinking』

    Joined:
    May 24, 2017
    Messages:
    2,502
    Likes Received:
    4,707
    Reading List:
    Link
    I'm more mad about how Mozilla once again changed the add-ons section; it totally broke the CSS in my userContent.css.
     
    TamaSaga likes this.
  8. sgrey

    sgrey Well-Known Member

    Joined:
    Jul 12, 2017
    Messages:
    1,215
    Likes Received:
    1,497
    Reading List:
    Link
    Are you basically saying it's OK not to optimize software and not to patch memory leaks because we have more RAM than before?
     
  9. TamaSaga

    TamaSaga Well-Known Member

    Joined:
    Oct 11, 2016
    Messages:
    1,726
    Likes Received:
    2,173
    Reading List:
    Link
    Last I checked, Firefox is open-source software. If you feel so strongly about this, go push the commits that keep and maintain the feature discussed in the original post. And don't give me that line about patching memory leaks and optimizing software; that's a never-ending battle that is always in scope regardless of features being added or removed.

    The fact of the matter is that not everyone is on a 2GB machine anymore, because RAM is so cheap: a 2GB module runs about $10, and 4GB will cost you $20. I feel for the people who are reluctant to upgrade, but I could get an SBC with better specs for less than $100.

    Until then, let the developers cater to the common population instead of spending time on issues that affect less than 1% of users:
    https://www.statista.com/statistics...e-online-gaming-platform-steam-by-system-ram/
     
  10. sgrey

    sgrey Well-Known Member

    Joined:
    Jul 12, 2017
    Messages:
    1,215
    Likes Received:
    1,497
    Reading List:
    Link
    As if they're going to accept a pull request for a feature they deliberately removed :)
    And I would have to hate myself to go and try to remove memory leaks from Firefox. I dropped that browser after they killed all the add-ons, again...
    In the first place, a browser shouldn't eat multiple gigabytes of RAM.
     
  11. TamaSaga

    TamaSaga Well-Known Member

    Joined:
    Oct 11, 2016
    Messages:
    1,726
    Likes Received:
    2,173
    Reading List:
    Link
    Unfortunately, web browser engines have become extremely complex, so this isn't something that can be solved easily. I mean, even Microsoft Edge is being ported to the Chromium engine now that Microsoft has given up on its own proprietary one.

    That said, it's possible to fork the Firefox engine and keep the feature. I'm sure someone is having that idea right now.
     
  12. Wujigege

    Wujigege *Christian*SIMP*Comedian

    Joined:
    Oct 6, 2016
    Messages:
    16,265
    Likes Received:
    15,756
    Reading List:
    Link
    I had something similar on my brother's laptop that auto-killed tabs past a set number. It was so frustrating when he was hogging all the Internet bandwidth.
    When he kept removing the add-on, I created a slower SSID for him and a faster one for myself.
    I am taking the fast lane.
    SCREW NET NEUTRALITY!!

    [IMG]
     
  13. sgrey

    sgrey Well-Known Member

    Joined:
    Jul 12, 2017
    Messages:
    1,215
    Likes Received:
    1,497
    Reading List:
    Link
    Microsoft gave up because they were behind for so long that it was pointless to start over with their own engine and take on all the bugs and flak all over again. And people are pretty tired of styling everything three ways: Edge, Firefox/Chrome, and the other browsers.

    As for removing the ability to run in a single process, that's not so bad, IMO. Running multiple processes, if done right, is beneficial in many ways. I don't know why the RAM usage grows so much; I'd guess memory leaks multiply with the number of processes spawned.

    Regardless of how complex the browser engine is, it's inexcusable to eat 4-8 GB on average, and my friend quite often had 16 GB eaten by Firefox. You can excuse some of it with ever-changing web standards, new scripting features, and support for all kinds of fancy stuff in the latest CSS.
    Just do some debugging. Hire a debugging guy. I still remember when Firefox shipped a video chat built right into the browser. Fix the memory leaks before doing things like that and then removing them shortly after. Such a colossal waste of resources...
     
  14. NodiX

    NodiX Well-Known Member

    Joined:
    Jul 25, 2017
    Messages:
    223
    Likes Received:
    148
    Reading List:
    Link
    Firefox's new Quantum engine is partly built in Mozilla's young language Rust, which leverages concurrency to get the most out of modern multi-core CPUs, so it's reasonable for them to make multiple processes the default. Right now, in the current state of the web, everything still runs on JavaScript, which is inefficient at using multiple cores. So today Firefox can't show its might against Chrome's V8 engine, which Google has kept optimizing over the years. However, things might change when WebAssembly becomes the norm; gaming natively in Firefox should be much faster than in Chrome (they still have Stadia, though). You'll have to wait until web apps use more and more WebAssembly to see how powerful Firefox currently is.

    CSS rarely hurts performance, except when it blocks HTML rendering while the browser loads a page. It's usually bloated JavaScript that hogs the memory. You're underestimating how sloppy average web developers are with their JavaScript bundles; even Qidian's Webnovel, Wuxiaworld, and Volarenovels, three webnovel publishers with fancy new sites, ship more than 3x the recommended amount of JavaScript. The issue isn't that browsers are getting more complex; it's that websites have been getting more and more bloated over the years.
     
    Last edited: Jul 14, 2019
  15. sgrey

    sgrey Well-Known Member

    Joined:
    Jul 12, 2017
    Messages:
    1,215
    Likes Received:
    1,497
    Reading List:
    Link
    I know. After doing code reviews for years for these so-called "developers", I know exactly how they code and how much JavaScript is in their pages.
    [IMG]

    CSS itself doesn't use that many resources, but implementing support for new features all the time, for ever-new standards, will introduce bugs, memory leaks, and potentially integration problems. Then you have things like Less, where some lazy developers, instead of precompiling it, deploy it as-is and compile it in the browser... As if megabytes of JS weren't enough, now you have to compile CSS too? And then you get animations, image filters, image transformations, drawing algorithms for color transitions, and other fancy stuff. All of that has to be rendered on the page while the JS is making async REST calls to other services, downloading yet more JS files, and getting recompiled because object state has changed.
    Yes, I am well aware of all the crap that's going on, and that's why I hate it all the more. A compressed copy of the installer for the shareware version of Doom takes up about 2.39 MB, with 3D, levels, game engine, etc. And there are a freaking 20 MB of live JS objects on NUF. There is just no reason for browsers to use so many resources. Regardless of the amount of JS on a page, eating 4 GB of RAM on average? That's absurd. Even operating systems don't require that much.
     
  16. NodiX

    NodiX Well-Known Member

    Joined:
    Jul 25, 2017
    Messages:
    223
    Likes Received:
    148
    Reading List:
    Link
    It's not really the browser devs' fault. Single-core CPU speed is already near its limit, so CPU makers can only add cores or cut prices to increase their value. That has been the trend in recent years, and it will most likely stay that way for a few more. New browser versions favor optimizing for those CPUs over RAM usage by splitting into multiple processes. Most browsers will consume a lot of RAM this way, since most engines were written with a single thread in mind. They could rewrite an engine from scratch in a new language fully optimized for multi-core CPUs, but of course no one is willing to repeat the tragedy Netscape went through in the early 2000s.

    I don't know much about what's going on inside Mozilla's Gecko or Google's V8 repos (I don't really know C++, so I've never looked), but I don't think memory leaks are the main issue here. If they were, tech articles and blogs criticizing browser devs for being sloppy would be all over the internet, creating major drama at least once a year. Instead, when I google "why browser use so much memory", I get articles like:
    https://www.howtogeek.com/334594/st...ur-browser-uses-lots-of-ram-its-a-good-thing/
    https://lifehacker.com/why-chrome-uses-so-much-freaking-ram-1702537477

    Yes, it's true that the web is progressing too fast for its own good. Bugs and memory leaks may come along with each new web standard implemented, but browser devs won't carelessly throw in a bunch of useless stuff unless there's demand for it. The web is developing at a rapid pace; it's no longer home only to content-centric websites, as more and more JavaScript-powered web apps arrive. A more powerful internet for everybody. It creates more business opportunities too, since client-side stacks and providers have matured and developing powerful web apps is (in many cases) free nowadays. This is the result of how the web, along with PCs, mobiles, and other devices, has progressed over the years. The only way to minimize the cost of this crazy fast-paced development for users is to show average web developers how easy it is to throw away unused code with a modern bundler like webpack or Rollup, and to tell them that jQuery and AngularJS 1 are obsolete and that they'd better learn leaner frameworks and libraries if they don't want to deal with JavaScript fundamentals and other boring stuff.
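    As a sketch of that bundler point: tree shaking needs little more than ES modules plus a production build. A minimal, hypothetical Rollup config (file paths and plugin choice are illustrative, not taken from any site mentioned in this thread) might look like:

```javascript
// rollup.config.js -- illustrative only; the entry path is made up.
// Because ES module imports/exports are static, Rollup can see which
// exports are never imported and drop them from the bundle (tree shaking).
import { terser } from 'rollup-plugin-terser';

export default {
  input: 'src/main.js',          // hypothetical app entry point
  output: {
    file: 'dist/bundle.js',
    format: 'iife',              // one self-executing file for a <script> tag
    sourcemap: true,
  },
  plugins: [terser()],           // minify whatever survives the tree shake
};
```

    The same idea applies to webpack with `mode: 'production'`; the key prerequisite in both cases is shipping `import`/`export` syntax rather than dynamic `require` calls, so the bundler can prove which code is dead.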
     
    Last edited: Jul 14, 2019
  17. sgrey

    sgrey Well-Known Member

    Joined:
    Jul 12, 2017
    Messages:
    1,215
    Likes Received:
    1,497
    Reading List:
    Link
    Yes, yes it is their fault :blob_grin:
    Er... what's your point? Multiple processes are very different from multiple threads, and you don't need a multi-core processor to spawn multiple processes. There isn't even a guarantee that when a browser spawns multiple processes they will in fact run on multiple cores. I also never said I was against it. Having multiple processes or threads does carry some overhead, but it shouldn't grow exponentially and it shouldn't take gigabytes.
    Yes, devs constantly mock Mozilla about memory leaks, and now Chrome too. Anyone who has worked on decently large or complicated software knows memory leaks are a problem and knows what they look like. There's no need to write articles about them; it would be redundant. And the articles you linked make really dumb arguments, written by marketing people to mislead non-technical readers. I don't have my RAM just so Firefox can eat it all. I don't see articles saying "not using your CPU is wasteful; if your CPU isn't pinned at 80-90% all the time, what a waste, so let's write software that maxes it out." If you want to check for memory leaks, it's easy: start a fresh browser with one tab open and note in Task Manager how much RAM it uses. Browse for a while as you normally would, then close every tab except one and check again. If the number grew, it leaks.
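    The same before-and-after measurement can be demonstrated outside a browser. Here is a toy Node.js sketch (names are made up for illustration) of the classic leak shape, a cache that nothing ever evicts:

```javascript
// Classic leak shape: a module-level cache that only ever grows.
const cache = [];

function handleRequest(payload) {
  cache.push(payload); // "temporary" data that is never released
}

// Heap in use, in megabytes, as reported by the Node runtime.
function usedHeapMB() {
  return process.memoryUsage().heapUsed / (1024 * 1024);
}

const before = usedHeapMB();
for (let i = 0; i < 10_000; i++) {
  handleRequest(new Array(1_000).fill(i)); // simulate normal use
}
const after = usedHeapMB();

// The work is "done", but the heap cannot return to baseline because
// `cache` still references everything: that is the leak signature
// sgrey's one-tab experiment looks for.
console.log(after - before > 10); // → true (tens of MB retained)
```

    In a real page the equivalent check is the browser's own memory tooling (e.g. `about:memory` in Firefox or the DevTools heap snapshot) before and after the workload, rather than Task Manager alone.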

    There are multiple levels of memory leaks: leaks in the browser itself, leaks in the page's JavaScript, and leaks in plugins. It's really easy to check. This is how much loading NUF leaks, btw:

    [​IMG]

    Also, I don't know what your point is. Technology develops fast, yes, and? There isn't a single JavaScript developer who thinks the way things are in JavaScript land is OK. Every year there are 20 new frameworks and buzzwords to learn, and before you realize it you're living in a madness where writing "hello world" means downloading 30 megabytes of JavaScript frameworks, setting up a Node.js server, and writing 250 lines of setup code before the actual "hello world"... Yes, progress :blobconfused:
    Tell them to stop using frameworks like AngularJS where they aren't needed; that would help quite a lot.
     
  18. NodiX

    NodiX Well-Known Member

    Joined:
    Jul 25, 2017
    Messages:
    223
    Likes Received:
    148
    Reading List:
    Link
    My point is that they optimized modern browsers for modern CPUs. That's the trend now that the web has evolved from static pages into web apps where people edit spreadsheets and photos client-side. Just think of every tab as its own little app; it's less stressful that way.
    I didn't know that. I guess I learned something new today.

    It's true that dumping all the responsibility for this fast pace of change onto JavaScript alone isn't good. That's why WebAssembly has been hyped in recent years. And a large number of emerging frameworks isn't entirely a bad thing. Libraries and frameworks won't get a decent amount of attention unless they innovate or make some other breakthrough, and in the end those innovations serve as material for other frameworks to adopt. There's really no need to learn all of them; keeping an eye on what's going on and what they contribute is already more than enough.

    I'd gladly download 30 megabytes of JavaScript frameworks locally if it means I can ship a 30 KB overall JavaScript bundle per page to browsers. JS frameworks have been slimming down in recent years, especially Svelte, which has emerged as one of the first compiler frameworks. Those frameworks also have CLIs and boilerplates, so there's no need to set everything up from scratch. Add bundlers making progress with tree shaking and code splitting, and I'd say it's better than the days when people put a jQuery script tag directly in the page.
     
  19. sgrey

    sgrey Well-Known Member

    Joined:
    Jul 12, 2017
    Messages:
    1,215
    Likes Received:
    1,497
    Reading List:
    Link
    We have really different views on optimization. Having the browser fork a few child processes helps with some stability issues and maybe makes it a bit more responsive, but I would hardly call that optimization. And consider that they can't even force the processes to run on multiple cores, which might actually be detrimental... Basically, they could have done the same thing on a single-core CPU with no problems. Although I've never really dug into browser engines, there are a few things that can be parallelized properly, like page rendering, parsing, and script compilation; those can take advantage of multiple cores, but it's questionable to what degree they do.
    It is bad when you're trying to get a job and every JS-related listing demands 30+ frameworks you absolutely have to know, plus some barely related technologies. It's also a big problem for people just entering the dev market, who are utterly confused about what to learn: Angular, React, or what? These frameworks get outdated by the end of the month. Worst of all, they often don't offer anything truly useful; you could spend a few minutes and do what they let you do without the extra baggage. I also wouldn't call them "innovative" or "breakthroughs"; frameworks usually don't invent things, they wrap and package things together. They're meant to be a convenience, but in the current situation they've become the opposite.
    Yeah, unfortunately, that's usually not how it works. Your client might not need to set up Node.js to run Angular, but you can bet they'll have to download Angular. And jQuery, and React, and whatever other framework you decide to use. There's no way around it: the more JS you write, the more JS your client has to download. It helps if the framework lets you ship only the parts you need; at least that lessens the load slightly. Of course, I'm not talking about server-side JS, which is a whole other story. Also, the way JS is handled is really suboptimal: by design it gets reloaded and recompiled multiple times, even as people try to make it less bloated by compiling it ahead of time. People at the front of the CS field are trying to find ways around this, but there's only been partial progress so far.
    And people still put jQuery tags in pages. Just look at NUF.
     
    Last edited: Jul 15, 2019
  20. NodiX

    NodiX Well-Known Member

    Joined:
    Jul 25, 2017
    Messages:
    223
    Likes Received:
    148
    Reading List:
    Link
    Firefox has only been adding Rust to its engine for about three years, mainly to optimize for WebAssembly (in one benchmark I saw, Firefox was 10x faster than Chrome at running wasm). Yeah, I guess it can't really go fully multithreaded while the bulk of the codebase is still C++. But I'm really looking forward to how all these browsers progress.

    Some frameworks do innovate compared with their predecessors; going from jQuery to React and its peers was definitely a breakthrough. Of course, writing a web app in vanilla JS is an entirely different story. If the UI doesn't need reactivity, there's no need for a framework.

    I guess I'm an optimist, since finding a job wasn't my first goal when I got into web dev. I know what you mean about how overwhelming the dev market currently is; it's not hard to find negativity when people talk about new stuff in the dev community.

    Have you checked out Svelte? It's a real step forward for the JS framework world. Just look how small Svelte's bundles are in this benchmark:
    https://www.freecodecamp.org/news/a...rks-with-benchmarks-2019-update-4be0d3c78075/

    This is the author's introduction a few months ago:


    If compiler frameworks like Svelte become popular in the future, then I'm really looking forward to it, too.