Intel Wireless Networking Hardware

Intel To Include Draft 802.11n In Centrino

filenavigator writes "Intel announced at the Globalcom 2006 Expo that they will be including Draft 802.11n hardware in their Centrino chips. It will be interesting, since they said they will start doing this sometime in the middle of 2007, and the 802.11n standard is not due to be finalized until 2008. Additionally, Draft 802.11n has been dogged by interoperability problems." From the article: "Although the news caused barely a ripple of reaction in the audience of software and hardware engineers, there are industry analysts who have already warned large buyers of wireless technology to resist the temptation to deploy high-speed IEEE 802.11n devices until the standard is ratified."

Comments Filter:
  • Eh (Score:2, Insightful)

    by wiz31337 ( 154231 )
    The only major issues I've seen with 802.11n are the decrease in range and the obvious speed differences. If it is backward compatible with 802.11a/b/g then this should be a big issue.
    • So much for proof-reading... This shouldn't be a big issue.
      • Re: (Score:3, Informative)

        by Sancho ( 17056 )
        The chipsets appear to be backwards compatible with 802.11g. Apple's been shipping draft-n equipment for a while now, though only marketing it as 802.11g. Seems to work fine on my network.
    • Hold on... A decrease in range? I thought 802.11n was supposed to boost range.
      According to this product description [superwarehouse.com]:
      "By overlaying the signals of multiple radios, Wireless-N's "Multiple In, Multiple Out" (MIMO) technology multiplies the effective data rate. Unlike ordinary wireless networking technologies that are confused by signal reflections, MIMO actually uses these reflections to increase the range and reduce "dead spots" in the wireless coverage area. The robust signal travels farther, maintaining
  • I recall seeing Belkin "pre-N" routers for sale in late 2004/early 2005. (Not that I'm convinced that the average home user needs more than 54Mbps right now, especially because most broadband connections are still in the 1.5 to 3.0 Mbps range. Actually, I'm doing fine with 802.11b still.)

    -b.

    • For normal browsing, yes, 11b is all you need. But its actual throughput in my experience is around 1 or 2 Mbps, which can limit Internet connections (I routinely get 4-6 Mbps over cable with BitTorrent) and, more importantly, file transfers over the network, like streaming video. And if you use a lot of network storage, even 11g is pretty painful.
    • by GotenXiao ( 863190 ) on Tuesday December 05, 2006 @05:44PM (#17120860)
      Why does everyone always assume that wireless networks are only ever used for internet access? Am I forbidden from running VNC to my desktop from my laptop? Can I not transfer files to my wifi-enabled Archos? Streaming media from my desktop to a TV downstairs?
      • exactly! mod parent up.
      • Why does everyone always assume that wireless networks are only ever used for internet access?

        Because that is exactly what the majority of wireless networks are used for. Not that you don't have a perfectly valid point, but you don't sound like a typical user.

        Once more wireless devices become popular (like 802.11 cell phones, streaming media players, printers, etc.), people will start to require faster wireless networks. Right now they aren't required for most users, but here's the catch: they are

    • by ArchAbaddon ( 946568 ) on Tuesday December 05, 2006 @06:10PM (#17121250)
      "Pre-N" was just a fancy marketing ploy be Belkin; their "Pre-N" products was made well before even Draft 1 was released. It is proprietary, and when the 802.11n draft is standardized, will probably not be upgradeable to the standard, and will only be backwards compatible to 802.11g with other wireless devices.
  • by Anonymous Coward
    "news caused barely a ripple of reaction in the audience"

    I misread that as "barely a nipple reaction".
    • by b0s0z0ku ( 752509 ) on Tuesday December 05, 2006 @05:15PM (#17120372)
      I misread that as "barely a nipple reaction".

      aka "802.11n leaves us cold?"

      -b.

      • aka "802.11n leaves us cold?"

        Spoken and moderated like true nerds that don't show their pasty bodies at the pool and really know a nipple. Because, a cold nipple, is a perky nipple [teenadviceonline.org]
        • Spoken and moderated like true nerds that don't show their pasty bodies at the pool and really know a nipple. Because, a cold nipple, is a perky nipple

          "Leaves us cold." Hence, no change from the previous cold state and, therefore, no change in nipple erection. If 802.11n didn't leave us cold, our nipples would rapidly be flattening. As far as the nerd thing: We Have Nipples Too y'realise, so we all know how they work...

          -b.

    • Extensive testing by the FCC and UL has virtually eliminated the possibility of 802.11n-induced biological effects in mammary tissue. The tingling must be something else.
  • I want to pay extra money for something that probably won't work very well (draft spec) and probably won't work at all with other 802.11n devices once they adhere to the real spec.

    Now I'm off to buy some more SCO stock!

    • by Threni ( 635302 )
      I want something that works now, not to wait until some standard is agreed upon which will make no difference to the kit I've already got. I don't care if my laptop won't work with some stuff in 4 years' time, because, like all hardware, it will be laughable in 4 years' time - that is, if it's working at all.

      • I have a ThinkPad 240 here that would disagree with that.

        300MHz, 128MB RAM, and still capable of playing video, music, etc. What more do you need from an ultralight laptop? And if I need more power, I can always use XDMCP to log in to my desktop.
        • by Threni ( 635302 )
          > 300MHz, 128MB RAM, and still capable of playing video, music, etc. What more do you need from an ultralight laptop?

          Enough power to develop .NET/NetBeans stuff. Those IDEs are resource hogs!
    • I just got my router, bridge, and laptop moved over to 54g; it figures that things would change.
    • It's my understanding that the hardware is unlikely to change between Draft-802.11n and the final 802.11n spec. Once the spec is finalized, you'll need to update your firmware. I believe that's what Apple and Intel are counting on. Apple is in bed with Intel at the moment, so I highly doubt the Centrino chipset will be incompatible with the one Apple put in the Core 2 Duo MacBook Pros.
  • by hirschma ( 187820 ) on Tuesday December 05, 2006 @05:24PM (#17120510)
    Pretty obvious how this plays out:

    * Intel will become, pretty much overnight, what all of these routers have to interoperate with,
    * Everyone else tweaks their chipsets to work with Intel,
    * Intel's interpretation of the draft standard becomes the standard.

    Yeah, I'm quite sure that the IEEE will do something to rock that boat.
    • by MojoStan ( 776183 ) on Tuesday December 05, 2006 @08:54PM (#17123154)
      * Intel will become, pretty much overnight, what all of these routers have to interoperate with,
      * Everyone else tweaks their chipsets to work with Intel,
      * Intel's interpretation of the draft standard becomes the standard.
      As I said in another comment [slashdot.org] (before reading your "Score:5" comment), "the standard" (draft 2.0, due March 2007) will be set before Intel's chipset (due April 2007) is released. Draft 2.0 will be tested and certified by the Wi-Fi Alliance [arstechnica.com], so Intel will most likely be tweaking their chipset to work with Draft 2.0. In fact, I bet all of the other wireless equipment makers will release their draft 2.0 gear before Intel.
  • by postbigbang ( 761081 ) on Tuesday December 05, 2006 @05:25PM (#17120524)
    1) It's still a draft, and anything can change between now and then (ask Synoptics).
    2) While backwards compatible with G, N requires special antennas (two of them, in diff-mode, to increase bit-rate); Centrino silicon will be new.
    3) Even though every fab house is trying to get market share in N, there's a lot unproven about its future and about which technologies might eclipse it.
    4) It thwarts the draft process of the IEEE; but I guess standards will go to those that buy them.

    Many tests have already turned up incompatibility issues and the mistakes that were made. Reserving notebook real estate for a chipset is just a rook move, and nothing more.

    Move along, therefore; nothing but PR prattle to see here.
    • Re: (Score:1, Funny)

      by Anonymous Coward
      Gee, you seem to have a firm handle on this. You really should mail those ignoramuses at Intel who have never thought about these kinds of issues, who don't know much about silicon, and who probably aren't good at keeping up with network technology and IEEE standards.



      I'm sure your valuable insights could save them hundreds of millions of dollars.

  • Obviously there'd be interoperability problems... the standard's not final yet.
  • I always thought that Centrino was the general laptop platform, and the chips were Pentium-M? Centrino was just a marketing term to imply a laptop with certain capabilities, which include WiFi.

    And if Centrino does literally mean the CPU chip, how the hell do they put a wireless network card IN the chip? Is this just a news report typo, or am I missing something?
    • Centrino means the laptop has an Intel Pentium-M, Intel Core Duo, or an Intel Core 2 Duo and an Intel wireless card such as the IPW2100(802.11b), IPW2200(802.11b/g), or the IPW2915(802.11a/b/g).
    • Re: (Score:3, Informative)

      by javaxman ( 705658 )
      Your information is starting to get just a tiny bit stale, although you're generally correct. "Centrino" can now be Pentium M or Core Solo, and "Centrino Duo" can be Core Duo or Core 2 Duo. [intel.com]

      Actually, they said "chips" not "chip", probably meaning the Centrino platform is made up of a number of (specified) chips, and now an 802.11n package is included in the mix. Right now you're still Centrino if you include one of three approved Intel wireless packages [intel.com]... this probably just means they've announced a fou

    • Centrino is a marketing term which more or less means an Intel CPU plus an Intel wireless chipset. It's an Intel marketing strategy which more or less tells OEMs, "We'll advertise this thing, you buy it, there will be demand." And frankly it works. Centrino products tend to be nice little packages.

      Regardless, Intel probably sees that it costs little to nothing to build pre-N tech into their newer chipsets. Businesses and home users both want the supposed greater range and bandwidth. Apple has already done
  • by troll -1 ( 956834 ) on Tuesday December 05, 2006 @05:31PM (#17120662)
    The technology will someday scale to 600Mbps, according to Bill McFarland, a member of the IEEE committee, with a range 50 percent greater than available with Wi-Fi now.

    In physics there's a measurement called "skin depth," which is the distance a wave travels before its power level drops to 1/e, or about 1/3, of its original value. The formula is something like (wavelength/2*pi). The FCC regulates the power of 802.11n to something like 1mW per channel. So unless these new chips will have more power than is currently allowed, how can they have a greater range?
    • by b0s0z0ku ( 752509 ) on Tuesday December 05, 2006 @05:44PM (#17120858)
      So unless these new chips will have more power than is currently allowed, how can they have a greater range?

      Better error correction, or use of a transmission method that's more robust in the face of a low signal-to-noise ratio, possibly. With a directional mic and some filtering software, you may be able to hear shouting from five miles away, for example.

      -b.

    • by InakiZuloaga ( 1022349 ) on Tuesday December 05, 2006 @06:16PM (#17121320)
      It depends on how well you are able to receive. There's a parameter called receiver sensitivity, which is the lowest power the receiver can pick up and still recover the correct data. If you have a receiver circuit with a sensitivity of, let's say, -90 dBm that allows you to reach 0.1 miles, another device with a sensitivity of -93 dBm will let you reach about 0.15 miles without changing the transmit power. The sensitivity depends on how noise-immune your receiver is, and that depends on the radio standard used.
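
      In rough numbers (a quick back-of-the-envelope sketch, assuming free-space/inverse-square propagation and the made-up figures above):

      import math

      def range_gain(old_sens_dbm, new_sens_dbm):
          # Under free-space path loss, received power falls 20 dB per decade
          # of distance, so every 6 dB of extra sensitivity doubles the range.
          delta_db = old_sens_dbm - new_sens_dbm   # -90 -> -93 dBm is a 3 dB gain
          return 10 ** (delta_db / 20.0)

      gain = range_gain(-90.0, -93.0)
      print("range multiplier:", round(gain, 2))          # ~1.41
      print("new range (miles):", round(0.1 * gain, 3))   # ~0.141, roughly the 0.15 quoted above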
    • I can't speak for range on its own, but it gets bandwidth in part because it uses the entire width of the band. Instead of three non-overlapping channels, you just get one channel.

      It also uses three antennas, and I think there is some sort of RF interferometry or some such bag of tricks to take three signals and get a better signal than they could with just one.
    • by Andy Dodd ( 701 ) <atd7@@@cornell...edu> on Tuesday December 05, 2006 @08:25PM (#17122820) Homepage
      Ugh, that's the most horrible "let's throw some random terms into my post and make myself look smart" post I've seen in a while.

      Skin depth has ABSOLUTELY NOTHING to do with this. Skin depth determines how far an RF signal will penetrate into a conductive or semi-conductive material (usually metal, often used to discuss RF penetration into water). Skin depth is a function of wavelength - The shorter the wavelength, the shallower the skin depth. Remember, this is a term of RF penetration *into a conductive or semi-conductive material* and is usually measured in fractions of a millimeter for most metals. It can be a matter of meters for water though, which is why submarines usually are contacted via VLF or ELF (very low frequency/extremely low frequency) - skin depth of VLF/ELF into water is pretty large due to the long wavelength. Still, in general, as far as Wi-Fi goes, skin depth is irrelevant and meaningless.
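
      To put rough numbers on that (a quick sketch using the usual good-conductor approximation and textbook material constants; treat the exact values as ballpark):

      import math

      MU0 = 4 * math.pi * 1e-7   # permeability of free space, H/m

      def skin_depth(freq_hz, conductivity, mu_r=1.0):
          # Good-conductor approximation: delta = 1 / sqrt(pi * f * mu * sigma)
          return 1.0 / math.sqrt(math.pi * freq_hz * mu_r * MU0 * conductivity)

      # Copper at Wi-Fi frequencies: about a micrometer.
      print(skin_depth(2.4e9, 5.8e7))   # ~1.3e-06 m

      # Seawater at VLF (10 kHz): a couple of meters, hence submarine radio.
      print(skin_depth(1e4, 4.0))       # ~2.5 m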

      Freespace RF propagation follows the inverse square law, just like any other electromagnetic radiation.

      That said, indoor wireless is typically NOT free space. The nature of indoor wireless means that a signal can take multiple paths between transmitter and receiver. Unfortunately, these paths can sometimes result in the signals at the receiver interfering destructively with each other, causing a significant reduction in signal strength. The best example you might be familiar with is FM radio - have you ever been sitting at an intersection in your car and the reception of the station you were listening to completely dropped out, only to come back to full strength when you moved your car a few feet? That's classic multipath fading.

      One solution to multipath is to use two or more antennas to provide what is called diversity. Usually, if one antenna is in a "dead spot", an antenna a half wavelength or so away (or closer but with a different polarization) won't be. This is why almost all normal 802.11a/b/g routers have dual antennas and most PC cards and built-in WLAN cards have dual antennas. The card (usually) selects the one antenna that gives the best reception and uses it. (This is called selection combining. There are other diversity techniques that are better than selection combining but a bit more complex.) Some newer cards may use other diversity reception methods to improve 802.11a/b/g performance.
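
      To make the diversity gain concrete, here's a toy simulation (a made-up Rayleigh-fading model, nothing 802.11-specific): with two independently fading antennas and simple selection combining, the chance of being stuck in a deep fade drops dramatically.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Independent Rayleigh-faded amplitudes on each antenna (mean power = 1).
      h1 = rng.rayleigh(scale=np.sqrt(0.5), size=n)
      h2 = rng.rayleigh(scale=np.sqrt(0.5), size=n)

      p_single = h1 ** 2                      # power seen with one antenna
      p_select = np.maximum(h1, h2) ** 2      # selection combining: use the better antenna

      deep_fade = 0.1   # 10 dB below the mean
      print("P(deep fade), one antenna:   ", np.mean(p_single < deep_fade))   # ~0.10
      print("P(deep fade), selection div.:", np.mean(p_select < deep_fade))   # ~0.01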

      Now, 802.11n takes diversity to whole new levels. It uses what is commonly called "multiple input multiple output" or MIMO. Fundamentally, MIMO takes multipath and turns it from a disadvantage to an advantage by transmitting different data on each path. Thus, a MIMO system can achieve higher data rates by effectively using multipath to create multiple independent channels.

      I have a paper saved somewhere that describes how MIMO works in detail, but the basics are that if you form a matrix with the complex path gains (i.e. both amplitude and phase) between individual transmit and receive antennas (e.g. t1, t2, t3, r1, r2, r3 for a 3x3 MIMO system) of the form
      [[gT1R1 gT1R2 gT1R3]
      [gT2R1 gT2R2 gT2R3]
      [gT3R1 gT3R2 gT3R3]]

      (BTW, Malda, LaTeX or MathML please? Octave/Matlab format isn't quite the hottest for representing a matrix in human readable form...)

      you can perform operations (I believe a singular value decomposition but my memory could be wrong and it may be another decomposition) on that matrix to form two transformation matrices and a diagonal matrix. The diagonal matrix contains the path gains of three independent pseudochannels (which I believe are either the square root of the matrix eigenvalues, the eigenvalues themselves, or the square of their eigenvalues), and the transformation matrices can be used to take transmissions intended for the three pseudochannels and convert them to actual transmissions/reception on each antenna.

      I'm sorry I did such a crap job explaining this, I really need to find that paper as it does a much better job. :)
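
      Here's a minimal numeric sketch of that decomposition in Python/NumPy, assuming it really is the SVD and using a random, made-up 3x3 channel matrix: the singular values come out as the gains of the three independent pseudochannels.

      import numpy as np

      rng = np.random.default_rng(1)
      # H[i, j] = complex gain from transmit antenna j to receive antenna i.
      H = (rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))) / np.sqrt(2)

      # H = U @ diag(s) @ Vh; U and Vh are the transformation matrices, and the
      # singular values s are the gains of three independent pseudochannels.
      U, s, Vh = np.linalg.svd(H)
      print("pseudochannel gains:", s)

      x = np.array([1 + 0j, -1 + 0j, 1j])   # one symbol per pseudochannel
      y = H @ (Vh.conj().T @ x)             # precode with V, send over the air
      recovered = U.conj().T @ y            # combine with U^H at the receiver
      print(recovered / x)                  # each stream scaled by its own singular value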

      Beware! Many companies have begun calling anything
  • Kiwipedia made it sound like 802.11n didn't really work. There was too much interference from all the other 2.4GHz devices. 802.11n over copper was replacing wireless 802.11n in most applications. It was the same modulation, but over wire, to increase bandwidth. When corporations talk about supporting draft 802.11n, they tend to refer to copper and not wireless.

    • by Gospodin ( 547743 ) on Tuesday December 05, 2006 @06:05PM (#17121202)

      Kiwipedia: Your source for all things New Zealandish.

    • by dgatwood ( 11270 )

      Why would someone do 802.11n over copper? Wired ethernet is at 10Gb/sec. speeds, while 802.11n is only a paltry 100-200Mbps.

      • by Andy Dodd ( 701 )
        And one of the key fundamental differences between a/b/g and n is the use of multiple antennas and multipath effects to increase data rate by creating multiple independent data channels.

        Copper is fundamentally single-path unless you use multiple copper connections, but at that point channel bonding is a hell of a lot easier than MIMO. :)
  • by blackmonday ( 607916 ) on Tuesday December 05, 2006 @05:51PM (#17120986) Homepage
    They are not advertising it, but Apple's new laptops have pre-N built in already. There is speculation that pre-N will fuel the iTV and its HD-capable HDMI port. Don't you love rumors?

  • Apple has apparently slipped 802.11n into some of the currently shipping machines via buffed up Airport cards - gotta love stealth upgrades, eh? :)
  • Looks like the /. editors dropped the ball in the Company Logo Department. Intel's Old Logo (1968-2005) [wikipedia.org] should be replaced with Intel's New Logo (2006-?) [wikipedia.org]
  • There's a certain point in the pre-standard development cycle when you know the range of possible ways the standard can go, such that you can implement ANY possible final standard with the same physical radio - but with updated firmware. The trick is not to do it so early that you've guessed wrong and have a radio that cannot be upgraded to the final standard.

    One assumes that Intel will have made sure their N implementation is upgradeable to the final IEEE standard.
  • by Glasswire ( 302197 ) on Tuesday December 05, 2006 @07:19PM (#17122116) Homepage
    ...It's only been 12 months since they changed it to the new one [intel.com]
  • by MojoStan ( 776183 ) on Tuesday December 05, 2006 @08:39PM (#17123014)
    First of all, this (802.11n in next Centrino) is very old news [xbitlabs.com] (Feb 2006).

    More importantly, Intel will in all likelihood be using draft 2.0 of the 802.11n spec, which is much closer to the final spec than today's crappy "pre-N" stuff (draft 1.0). Draft 2.0 equipment will even be tested and certified by the Wi-Fi Alliance [arstechnica.com] for interoperability.

    Draft 2.0 is due to be ratified in March 2007. Next-gen Centrino (Santa Rosa) is due in April 2007. In the unlikely event that draft 2.0 is not ratified, the Wi-Fi Alliance will put together de facto standards, which will still be much better than today's draft 1.0. Any respectable article would mention this very important information.

  • With the widespread adoption of 802.11 (pre-)n devices, as consumers are likely to bend eventually to the marketing of all the retailers, even more interference will be caused on the 2.4GHz band (especially since there are only 3 non-overlapping channels for 802.11{b,g} devices). In the times ahead, 802.11a will become more popular, as there is less interference and more channels, even though the range is shorter and it has "only" 54Mbps of bandwidth.
  • From RoughlyDrafted:

    I got some criticism for writing in How Apple's iTV Media Strategy Works [roughlydrafted.com] that I thought Apple's new iTV was going to incorporate 802.11n, the new and much faster industry standard for wireless networking. Some readers thought that n isn't going to be ready in the timeframe Apple announced for iTV's arrival, while others said 802.11g is plenty fast enough to stream video already.

    N: Ready and Willing

    Wireless n most certainly is going to be ready, however. Even if the IEEE doesn't get aro
  • Intel announced at the Globalcom 2006 Expo that they will be including Draft 802.11n hardware in their Centrino chips.

    Whoa, hold on a minute here.

    Correct me if I'm wrong, but I was under the impression that "Centrino" is something of a blanket term encompassing an Intel processor, chipset, and wireless card/chip.
