We already knew Intel’s follow-up to the Gemini Lake line of cheap, low-power Celeron and Pentium processors would probably be code-named Elkhart Lake and that the new chips could pack way more performance, possibly by combining high-performance “Sunny Cove” CPU cores, lower-power “Tremont” cores, and Intel Gen11 graphics.
Now, if a leaked product roadmap from electronics company Mitac is accurate, we also know one more thing — we’ll have to wait a little while for those chips to hit the market. They’re not expected to launch until the first quarter of 2020.
Here’s a roundup of recent tech news from around the web.
- Intel’s Elkhart Lake chips likely coming in 2020 [FanlessTech]
Intel Core i-series “Comet Lake” desktop chips are slated for a Q2 2020 release, according to the same roadmap.
- Google accidentally leaks Nest Hub Max [Android Police]
Google may have an awkwardly named 10-inch smart display on the way. Mentions of the new Google Assistant-powered device showed up briefly on the company’s website.
- Valve Teases ‘Index’ VR Headset For May [UploadVR]
Valve’s VR headset is coming in May, and although we have a rough idea of what it looks like now, that’s about all we know for certain at this point.
- Magisk v19 beta released [xda-developers]
The Magisk v19 beta brings support for the Android Q beta, native 64-bit support, a new MagiskHide implementation (to hide root access), and other changes… but it doesn’t yet support Google’s Pixel 3 series smartphones.
- Anker Preps Nebula Gaming Controller [ZatzNotFunny]
It looks like the game controller would work with the company’s Android-powered Nebula projectors… but you could also use it as a Bluetooth controller for other Android devices, including phones and tablets.
You can keep up on the latest headlines by following Liliputing on Twitter and Facebook.
It’s laughable to me when I read about graphics improvements to Intel chips. For real, how many times and for how long have they been bringing out this nugget? It’s stale. And yes, by all accounts, Intel graphics suck regardless of how long they’ve had and how many promises they’ve made on the subject. As they say, tell me another one.
Agreed. Intel has been in a position to pretty much eliminate Nvidia’s low-end graphics card market (e.g. the MX 150 and the like) but hasn’t bothered. After all these years, their built-in GPU is still only capable of running much older games. It’s still the equivalent of 8-10 year old midrange discrete GPUs.
I view this situation as contrived. Intentional. Intel has this habit. I think it’s called “partnership”. They won’t be all they could be because it might affect partners, and they can’t/won’t/don’t step on the toes of others. After all, it’s all about money. Intel is choosing to suck. I’d be fine with it if someone could argue that Intel doesn’t have the brains, money, and development abilities to release a capable graphics solution.
That’s not how it works (except in areas where anti-trust is a serious concern, but the US government has been AWOL for years on that score, so it’s not). Intel has chosen to focus their time and efforts elsewhere, that’s all, though that’s now changed, and they’re throwing a lot of resources into catching up.
Intel GPUs have been “good enough” for years, for everything outside of gaming or specialized graphics work. Even on my desktop computer, with its Radeon RX 480 GPU, I rarely do anything that the embedded Intel HD Graphics 530 GPU can’t handle, even at 4k.
Intel’s volume sales have been driven by servers and the corporate market, where graphics requirements are secondary to non-existent. They’re getting interested now because GPU technology is proving to be useful in new fields like deep learning and other big data applications, not to mention the potential for server farms filled with GPUs for cloud based applications.
It’s going to take them a while to catch up, but they’ve already recruited some of the biggest names in graphics technology development, and they have the financial muscle to do it.
I don’t care about what they name it.
I don’t care what the advertisements say about its improvements.
I don’t even care about its price (it can’t go any higher).
All I care about is verified, trusted third-party testing to KNOW how big the improvements are in real-world conditions. There have been negligible improvements in their 14nm 2c/4t U-series chips from 2016-2019, in terms of both battery life and performance.
In the technology field, that’s an eternity. It’s even longer on the desktop chipset front when looking at the Core i7-5960x and Core i7-2600k.
Tack on top of this that their graphics performance has stagnated for the last five years. Even if they double the performance of what they currently have, the fact that they haven’t moved the needle for so long is the tell-all. In reality, they are trying to make up for years of what should have been incremental performance improvements.
Intel is guilty of dragging their feet somewhat, which is why it took AMD to push them into using quad-core designs in their i3 processors, but some of it has been down to their 10nm process woes, which were hardly planned.
But on the other side of the coin, the simple fact is that any laptop or desktop with an SSD and 8GB of RAM from the last six or seven years is more than fast enough for 90% of the tasks people use computers for. The ThinkPad T430 I’m using to type this comment is almost seven years old, and after upgrading to an SSD and 8GB of RAM, it can do everything I want it to do other than play games. It would be nice to have a new computer, but performance-wise I still can’t justify the cost.
Is Bay Trail still around?