Displays Intel Television Upgrades Hardware

Intel Compute Stick Updated With Cherry Trail Atom, Tested (hothardware.com) 90

MojoKid writes: The original Intel Compute Stick wasn't without issues. Last year's model featured dated 802.11n wireless connectivity and had only a single USB port, which meant using a hub and/or dongles, should you want to connect multiple peripherals to the device or boost its wireless capabilities. The new updated Intel Compute Stick, however, features Intel's newer Cherry Trail Atom platform, with 802.11ac 2x2 WiFi, and USB 3.0. There's still just 2GB of RAM in the device, along with 32GB of storage, but Windows 10 Home also now comes pre-installed. The result is a fully functional PC that won't burn up any benchmarks but offers utility for mainstream computing tasks and is even capable of streaming up to 4K video content. The little device can essentially turn any HDMI-equipped display into a basic PC.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Poor Intel (Score:4, Insightful)

    by Anonymous Coward on Saturday January 23, 2016 @07:27AM (#51356349)

    The proliferation of ARM architecture seems to be scaring the shit out of them

    • With nearly steady revenue of $15B quarter over quarter, I would hardly call diversifying to fend off a competitor "scaring the shit out of them".

      Creating this just seems like good business sense. Now, when they start giving them away for free in an attempt to fend off a competitor, then we can talk about being scared.

  • by gerddie ( 173963 ) on Saturday January 23, 2016 @07:56AM (#51356391)
    I wonder whether Microsoft is paying Intel, or Intel is paying Microsoft to put Windows 10 on it. IMHO Android x86 stripped free of Google spyware would have been a better option for "basic computing tasks".
    • by cerberusss ( 660701 ) on Saturday January 23, 2016 @08:03AM (#51356401) Journal

      There's definitely a deal involved. Windows is the only OS that really needs Intel. And MS has shown with the (failed) Windows RT that in a pinch, they could do without.

      • Windows IoT Core also runs on ARM, so the core code base is certainly cross platform.

        • Hence why MS is pushing Windows 10 and Metro over Win32 app development so hard.

          NT was purposely developed on non-Intel CPUs from day one so it wouldn't be tied to one architecture: first on MIPS, with Alpha and PowerPC ports following, and later the Server releases were built for Itanium with the work backported to x86.

          It is the applications that keep Windows around.

          If everyone ported their applications to the Metro APIs, then ARM wouldn't be a problem.

          • Problem is that Microsoft had a golden opportunity in the 90s to build a platform-independent ecosystem, not just architecture, but Gates (not you) was more interested in the partnership w/ Intel. Therefore, Microsoft deliberately neglected the RISC ports of NT. Since MIPS and Alpha were 64-bit workstation-class CPUs, Microsoft could have developed a 64-bit NT on those and built an ecosystem of portable apps. That would also have helped them migrate more decisively to 64-bit Windows. That wou

            • MS did port Windows 2000 to 64-bit Alpha, but it was only for internal use. No idea what it looked like: full GUI with Minesweeper and Solitaire, or just a kernel with text output to a serial port?

              But in the late nineties, when your typical desktop had 16MB to 64MB of memory, I'm not sure that 64-bit was all that needed. Also, by early 2000 you could get a dual Pentium III with lots of MHz and a GeForce 256 graphics card ("workstation" graphics cards such as 3Dlabs were still around too, for the PC). So I'm

              • Those would have been niche platforms for niche applications, such as AutoCAD, Pro-Engineer, Mathcad, OrCAD, VHDL, Verilog, et al. For those sort of things, the Alpha would have been great. And NT on MIPS could have run applications similar to the ones running on Silicon Graphics workstations. The market wouldn't have been the same as desktops, but having a full blown Office would have helped as well. But it would have been a good dev platform for building today's apps.

                However, I do think that Intel w

            • by Junta ( 36770 )

              The thing is just having the ports wouldn't have been enough anyway. At the time there was no strategy for cross-platform executables (no OSX-style multi-arch binaries, no java-like bytecode thing that was yet in a suitable shape to displace native applications of the day....
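              For context on why every NT port needed its own binaries: a Windows PE executable records its target CPU as a 16-bit machine code in its COFF header, so an Alpha build simply wouldn't load on MIPS or x86. A small sketch (hypothetical helper, standard library only) that reads that field from any .exe or .dll:

              ```python
              import struct

              # Machine codes from the PE/COFF format; covers the architectures NT shipped on.
              MACHINES = {
                  0x014C: "x86",
                  0x8664: "x64",
                  0x01C0: "ARM",
                  0xAA64: "ARM64",
                  0x0200: "Itanium",
                  0x0166: "MIPS R4000",
                  0x0184: "Alpha AXP",
              }

              def pe_machine(path):
                  """Return the target architecture of a PE executable at `path`."""
                  with open(path, "rb") as f:
                      if f.read(2) != b"MZ":                       # DOS stub signature
                          raise ValueError("not an MZ/PE executable")
                      f.seek(0x3C)                                  # offset of e_lfanew
                      pe_offset = struct.unpack("<I", f.read(4))[0]
                      f.seek(pe_offset)
                      if f.read(4) != b"PE\x00\x00":                # PE signature
                          raise ValueError("missing PE signature")
                      machine = struct.unpack("<H", f.read(2))[0]   # COFF Machine field
                      return MACHINES.get(machine, hex(machine))
              ```

              Point a call like `pe_machine("app.exe")` at any old NT-era binary and it will tell you which of those ports it was compiled for.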

            • NT on Alpha was popular for graphics apps like LightWave. I had Windows 2000 RC3 for the Alpha. My school used them because they were fast.

              NT for PowerPC ran on some IBM workstations too.

              The movie Titanic was rendered on Alphas running NT and Linux. Visual Studio and Office were ported over, and so were server products. Exchange 2003, SQL Server 2005, and Server 2003 were all demoed on Itanium systems back in 2003 for benchmarks.

              I can't fault them. I fault the PHBs and developers who wanted everyone else to take the risk first before porting.

              • Ok, but that was a niche. It could have spread out more, but didn't. I do blame Microsoft - like you said, it was just a question of recompiling, and if it wasn't, Microsoft could have worked internally to make sure that VC++ and Visual Studio worked thoroughly w/ MIPS and Alpha. You are mistaken about Office being ported - while Word and Excel were, Access wasn't - and that would have been one huge thing they could have had there. PowerPoint too.

                My point is that Windows would have been as native to A

    • Please tell me that the OS they are running is 32-bit. I can see the OS eating just about all the storage if it's 32 GB: already on my Winbook w/ that configuration, I've set up everything to save on the 64GB SD card.
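    If you want to check what you actually have on a machine like that Winbook, a quick portable sketch in Python (the default "/" path is an assumption; on Windows you'd pass something like "C:\\", and note this reports the Python build's bitness, which normally matches the OS):

    ```python
    import platform
    import shutil

    def os_report(path="/"):
        """Return the interpreter's bitness and free space in GB on a mount point."""
        bits, _ = platform.architecture()            # "32bit" or "64bit"
        free_gb = shutil.disk_usage(path).free / 1024**3
        return bits, round(free_gb, 1)
    ```

    On a 32GB device, comparing that free-space figure against the total makes it obvious how much the OS itself is eating.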
  • I want my next TV not to do anything on its own. I only need it to switch between the 4-5 HDMI inputs it needs.

    Putting OSes which won't be supported in a few years, and apps which are there just to advertise the current generation of TVs, into a device which easily lasts 5-10 years is nothing but planned obsolescence.

    • what you're describing is called a computer monitor.

      • what you're describing is called a computer monitor.

        I'm surprised there aren't a lot of 'monitor only' media products. No need for mine to have speakers or smarts, just on/off and video processing & picture adjustments. In fact, I'd like to see a thinner product that moves all the tuner/video processor/inputs to a separate box.

        • so like a samsung TV + samsung evolution kit? They release a new evolution kit every 1 - 2 years to give the TV support for newest cabling standards and codecs. (that's about all i know about it, so don't ask me additional questions)

          i think this is a step in the right direction; if it does what i'm hoping it does, i would no longer be afraid to buy an expensive TV. at the moment, i refuse to buy something that won't support standards common in 2 years. this gives one a path of upgradability.

          • so like a samsung TV + samsung evolution kit? They release a new evolution kit every 1 - 2 years to give the TV support for newest cabling standards and codecs. (that's about all i know about it, so don't ask me additional questions)

            i think this is a step in the right direction; if it does what i'm hoping it does, i would no longer be afraid to buy an expensive TV. at the moment, i refuse to buy something that won't support standards common in 2 years. this gives one a path of upgradability.

            Kind of like that, but take it out of the monitor cabinet and move it to a box to place by the AVR.

        • by Junta ( 36770 )

          I agree with the sentiment, but the incremental cost of the $10-15 board that makes a TV 'smart' is next to nothing, and the cost of maintaining multiple SKUs far outweighs the savings from skipping it. The mass market not being able to start Netflix out of the box can be a severe competitive disadvantage if a vendor actually skipped the concept entirely.

          I just ignore the existence of the DLNA/netflix app/etc my TV has (terrible experience anyway). It works as a 'dumb monitor' just

          • You misunderstood. The smarts can still exist in the box where the tuners are and the video processing occurs. The only piece that would be eliminated is the speakers.
            • by Junta ( 36770 )

              But again, the cheap speakers they put in cost nothing compared to contending with customers not understanding, having to maintain different models to cater to both, and the exchanges when someone realizes they got the lesser one and wanted the speakers.

              • I think there is a good market out there for a very thin monitor driven by an auxiliary box with only one thin cable between the two. Yes, there would be a chunk of the market that just wants an integrated TV, but even Joe Average these days is often using an AVR and/or soundbar. The high-end market is competitive; there is no doubt room. There are also a crapload of housewives who would like a thin monitor flat on the wall.
      • Yes, I too would like a monitor-style TV (loads of inputs, *no* built-in tuners or even built-in audio), but you wouldn't suggest an actual computer monitor, because the price climbs steeply once you go beyond a 24" monitor.

        Dell's 55" computer monitor ("only" 1080p!) is over 1,000 pounds ($1500) in the UK, whereas a 55" 1080p TV can be had for as little as 400 pounds ($600).

        • There is a state-mandated "television license" attached to TVs sold in my country (and most other countries in the fascist union; I'm not sure there is an EU country without this?). I know this is true in the UK. This gives actual computer monitors somewhat of a competitive edge over TVs if you do not already have a TV (and already have to pay the propaganda license you're likely never using). I can basically buy a TV and be forced to pay the regime for nothing, or just use computer monitors and "save" enou
          • by Teun ( 17872 )
            First, there is nothing fascist about most EU countries and certainly not about the EU itself.

            And no, it's not everywhere you pay a TV licence, here in The Netherlands it was scrapped years ago and the money for the public broadcasters is now coming from the state budget.
            As has been said for so long, checking on the licences was expensive and virtually everyone watches TV anyway.

            Now, I believe a lot of people will agree with me that the BBC is doing a damn good job with the rather high licence money th
    • by kkwst2 ( 992504 )

      I think there is no financial motivation to do this. The hardware really takes up no significant space and for a high end TV is a trivial fraction of the cost. It is a small minority of people who would care about this. So you're going to get your Smart TV and like it.

      I don't really find it gets in the way. I never use the Smart TV features but I never see them either.

    • I completely agree with those who say that a (big) display with a lot of connectors (no "smart" crap, no speakers, no nothing but a display) is preferable. I never bought a television and enjoy not having one. I have been saying for years that some big corporation should make simple and affordable 40" or even bigger computer monitors, since there obviously is a market. But what do they do? Add more crap to their propaganda devices.

      Putting OSes which won't be supported in a few years, and apps which are there just to advertise the current generation of TVs, into a device which easily lasts 5-10 years is nothing but planned obsolescence.

      I tell you: Utterly Stupid and Easily Compromised Things is just the beginning

    • A good solution would be something like MINIX that could be programmed to do just what you want, and nothing more. Not too big, so it won't eat huge resources.
  • Just last week, a friend gave me an old PC which he had used as a BSD file server. It has a Celeron 430, 1.8 GHz. I threw in 4 gigs of DDR2 and an old 128 gig Kingston SSD.

    Could anyone tell me how this stick compares to the above machine?

    • I suspect you can't put that PC in your pocket.

    • by xiando ( 770382 )
      > Just last week, a friend gave me an old PC which he had used as a BSD file server. It has a Celeron 430, 1.8 GHz. I threw in 4 gigs of DDR2 and an old 128 gigs Kingston SSD.

      1) The PC-on-a-USB-stick does not have SATA connectors.
      2) The PC-on-a-USB-stick does not let you add PCIe cards.

      And that is why you can't use a PC-on-a-stick, or more importantly an ARM device, as a home firewall or NAS server and so on. The Celeron 430 is likely preferable for a whole lot of use-cases.

      The advantage o
    • It's a "Core 2 Solo" you have there. A fast single-core PC is not bad at all, except perhaps with Windows + antivirus + crapware running.
      Even multitasking is not a problem, unless you run a CPU hog. What made me move away from single core was how installing an OS in a VM, with the CPU at 100% for every I/O, made the PC run like crap (after installing, though, the VirtualBox guest additions remedy that by a fair amount).

      BTW, it's a Celeron 430, not a Celeron M 430. The Celeron M has a 533MHz FSB and 512K L2, and is 32-bit; the Celeron

  • Why would anyone design a small plastic shell with a narrow airspace inside, requiring what must be the world's tiniest, cutest, and probably most dust-prone fan, which spins up "fairly often" according to the article, to provide cooling that is obviously insufficient for longevity, since the device is "warm to the touch"...

    And side connectors that are (likely) PCB-mounted. Yeah, those tiny high-strain connectors that tend to spread out or pop off, rendering expensive modern devices still functional but useless...

    All this INSTEAD of making the w

    • by petes_PoV ( 912422 ) on Saturday January 23, 2016 @09:46AM (#51356611)

      Please explain like I'm five

      OK, i t ' s . . t o o . . e x p e n s i v e.

      The plastic enclosure can be stamped out for a penny apiece. An aluminium one would cost more. The device only has to last as long as its warranty period. No reviewer is going to have the device for more than a few days before they write their glowing, uncritical, simplistic reviews (basically: it's shiny, buy it), so the chances of one failing are minimal.

    • by Junta ( 36770 )

      It's a cheap throw-away device.

      All connectors are PCB-mounted; the question is whether they are surface-mount or through-hole. One would hope they wouldn't surface-mount the connectors, though I have seen that done. Considering that the metal shrouds around most computer connectors provide only EMC shielding and rarely reinforce them structurally (some blind-mate connectors are a bit more reinforced), this has worked in practice.

      So the biggest question that I would have is whether they could have

      • It's a cheap throw-away device.

        People keep saying that while I see price ranges from $88-$150.

        Is this cheap as in unit prices will go down until anyone who bought one in the last three months will be really pissed? Or cheap as in that $90 Bluetooth headset I bought where they said the battery was not replaceable and, by the way, it only lasts three years? Or cheap as in those burning-hot Chinese power supplies that make me question whether UL is on the take? Or cheap as in rich people kinda think it's cheap?

        I actually am looking for a low po

Our OS who art in CPU, UNIX be thy name. Thy programs run, thy syscalls done, In kernel as it is in user!
