Probably not. Electron is popular not just for its cross-platform support, but also because the skills it requires transfer directly from existing web dev.
Instruction decoding takes space and power. If fewer, smaller transistors are dedicated to the task, it takes less space and power.
Well, not exactly. You have to remove instructions at some point. That’s what Intel’s x86-S is supposed to be. You lose some backwards compatibility, but the dropped instructions are chosen to have the least impact on most users.
I also haven’t wanted an Intel processor in a while. They used to be best in class for laptops prior to the M1, but they’re basically last now, behind Apple, AMD, and Qualcomm. They might win a few specific benchmarks that matter very little to people, and they’re still the default option in most gaming laptops. For desktop use the Ryzen family is much more compelling. For servers they still seem to have an advantage, but that’s also an industry built on longer-term contracts, which Intel has more infrastructure for than its competitors; even there, ARM is gaining ground with exceptional performance per watt.
Exactly. Adding a third should be much simpler than a second.
As a fellow RISC-V supporter, I think the rise of ARM is going to help RISC-V software support and eventually adoption. They’re not compatible, but right now developers everywhere are working to ensure their applications are portable and not tied to x86. I imagine, too, that when it comes to emulation, emulating ARM is going to be a lot easier than x86, possibly even statically recompilable.
I’m both surprised and not surprised that ever since the M1, Intel seems to have been doing nothing in the consumer space. Certainly losing their contract with Apple was a blow to their sales, and with AMD doing pretty well these days, ARM slowly taking over the server space where backwards compatibility isn’t as significant, and now Qualcomm coming to eat the Windows market, Intel just seems like a dying beast. Unless they do something magical, who will want an Intel processor in 5 years?
All else being equal, a complex decoding pipeline does reduce the efficiency of a processor. It’s likely not the most important aspect, but eventually there will be a point where it does become an issue once larger efficiency problems are addressed.
We stuck to x86 forever because of backwards compatibility and because nobody had anything better. Now manufacturers do have something better, and it’s fast enough that emulation is good enough for backwards compatibility.
I think it is this way because Apple thought it would be misleading if the option said “deny tracking”, since there isn’t a specific technical mechanism to enforce that. It’s unfortunate, but I’d rather it be honest than lie.
If it ends up being ruled that training an LLM is fair use so long as the LLM doesn’t reproduce the works it is trained on verbatim, then licensing becomes irrelevant.
Western governments need to step up their subsidies for green tech then to compete, I guess. Not start banning the people who are providing the solution.
Yeah, no such catastrophic celestial events are likely in the next few millennia, and we’re pretty good at predicting those things now. Meanwhile, climate change is already affecting a billion or more people right now.
I’m fully aware that EVs won’t solve the climate crisis. And of course leaders in the West, especially the US, pitch consumerism as the solution to climate change. Unfortunately, many people, myself included, have no option but to drive, as public transit has been purposefully dismantled, and opting for an EV (when already buying a car) is one of the only real choices with any noticeable climate impact.
Alternative plan: we all are stuck on this rock together and maybe we should prioritize maintaining its habitability over bickering about who is allowed to provide the solution.
Western governments: We need to take climate change seriously and transition to renewables and EVs.
Also western governments: It’s bad that China has ramped up production on renewable energy sources and EVs, hit them with tariffs to protect our insufficient domestic production.
As someone who primarily uses Unix-like systems and develops cross-platform software, having Windows as a weird outlier is probably best for the long term. Windows is weird and dumb, but it forces us to consider platform differences more explicitly. In the future, if a new operating system becomes popular, all the checks that were implemented for Windows will make it a bit easier to port to newer systems.
If something like that were to work, a lot of effort would need to be put into minimizing the UI friction. I could see something like this: uploaders add topic tags to their videos, and an AI runs in the background to generate and apply new tags based on the content (most people would not understand how to properly tag content). An AI would also be used to build a graph of related tags, where similar or closely related tags are nodes joined by an edge.

Then, on first login, the user is prompted to pick some tags to start with. Over time, the client uses the tag graph’s adjacencies to fine-tune the user’s tags, on device. The idea is that we could get a decent algorithm that recommends new stuff based on what the user watches, while keeping the processing of user-specific data local.

The client would also have an option the user could enable to contribute their tag information back to the global tag graph, improving it for everybody. That data could be combined with other users’ data at the instance level to somewhat anonymize it, assuming a large multi-user instance. If you host a single-user instance, you’d probably not want to contribute to the global tag graph unless you’re OK with your tag preferences being public.
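A minimal sketch of the tag-graph half of that idea (all names and data here are hypothetical, not any real platform’s API):

```python
from collections import defaultdict

# Global tag graph: edge weight = how often two tags co-occur on a video.
tag_graph = defaultdict(lambda: defaultdict(int))

def add_video_tags(tags):
    """Instance side: update the global graph when a video gets tagged."""
    for i, a in enumerate(tags):
        for b in tags[i + 1:]:
            tag_graph[a][b] += 1
            tag_graph[b][a] += 1

def recommend_tags(user_tags, top_n=5):
    """Client side: score tags adjacent to what the user already watches.

    Runs on-device against a downloaded copy of the graph, so the
    user's watch history never has to leave the client.
    """
    scores = defaultdict(int)
    for tag in user_tags:
        for neighbor, weight in tag_graph[tag].items():
            if neighbor not in user_tags:
                scores[neighbor] += weight
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical data: three tagged uploads.
add_video_tags(["linux", "self-hosting", "networking"])
add_video_tags(["linux", "gaming"])
add_video_tags(["self-hosting", "networking", "privacy"])

print(recommend_tags(["linux", "self-hosting"]))
# -> ['networking', 'gaming', 'privacy']
```

The point of splitting it this way is that the shared graph only holds tag co-occurrence counts; what a particular user watches stays on their device, which is what keeps the recommendation step private.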
It’s a bit tricky but I think a privacy preserving algorithm is possible. Simply put, the more data available, the better an algorithm can be.
This is how I would describe my experience. Sometimes it’s crunch time and most of the time it’s fuck around time. After crunch time I always throw a tantrum about how if we only bothered with planning we could largely avoid it.