Student Invention May Significantly Extend Mobile Device Battery Life

imamac writes with this excerpt from news out of Carleton University: "Atif Shamim, an electronics PhD student at Carleton University, has built a prototype that extends the battery life of portable gadgets such as the iPhone and BlackBerry by getting rid of all the wires used to connect the electronic circuits with the antenna. ... The invention involves a packaging technique to connect the antenna with the circuits via a wireless connection to a micro-antenna embedded within the circuits on the chip. 'This has not been tried before — that the circuits are connected to the antenna wirelessly. They've been connected through wires and a bunch of other components. That's where the power gets lost,' Mr. Shamim said." The story's headline claims the breakthrough can extend battery life by up to 12 times, but that seems to be a misinterpretation of Shamim's claim that his method reduces the power required to operate the antenna by a factor of about 12 (from 38 mW down to 3.3 mW). The research paper (PDF) is available at the Microwave Journal. imamac adds, "Unlike many of the breakthroughs we read about here and elsewhere, this seems like it has a very high probability of market acceptance and actual implementation."
  • Counter-intuitive! (Score:5, Insightful)

    by 4D6963 ( 933028 ) on Friday December 19, 2008 @07:43PM (#26179879)
    Wow, is it me or does it feel profoundly counter-intuitive that you'd lose more power over the wire than over radio waves?
    • Re: (Score:3, Interesting)

      by Cylix ( 55374 )

I don't think he's separating the amplifier from the antenna; rather, he may be feeding an amplifier that's attached almost directly to the antenna. The loss in signal from source to antenna over the length of the run has to be made up, and that's done by stepping up the output of the amplifier stage.

      This configuration isn't uncommon and many microwave systems employ this technique. (Attaching the amplifier nearly directly to the antenna.)

Though I would have to look a bit at the design, this is the only item I can think of. From near

      • by Anonymous Coward

        "This configuration isn't uncommon and many microwave systems employ this technique. (Attaching the amplifier nearly directly to the antenna.)"

I agree, it sounds very much like some kind of Impedance Matching technique where the Inductive coupling is direct to the antenna. I'm not so sure that's as patentable as this University is drumming it up to sound. (I guess they hope to earn a lot of money from it, mainly from phone companies.) But Impedance Matching using windings to effectively wirelessly couple

    • Re: (Score:3, Interesting)

      by linzeal ( 197905 )
There are many orders of magnitude more atoms in the traces on the PCB than in the air the radio waves travel through from the antenna on the cell phone to the cell tower. There are even fewer when we are talking a matter of mm. The more atoms you have to push your information through, the more amperage it takes to overcome the resistance [wikipedia.org], and since radio waves are a form of EM radiation they follow similar laws, which just appear more complicated [wikipedia.org].
      • Re: (Score:1, Interesting)

        by timmarhy ( 659436 )
umm, doesn't air have a lower conductivity than copper, hence electricity runs happily along copper at low voltages but needs 1000 volts to jump just 1 cm through the air? TFA is hopeless; it almost sounds like he cut the wires on his iPhone, which stopped it transmitting, then declared a major breakthrough in battery life.
        • by sillybilly ( 668960 ) on Friday December 19, 2008 @08:59PM (#26180473)
          We're talking superhigh frequencies near 1 GHz. At such frequencies all of the electric/magnetic field generated "current" runs on the surface of wires anyway, not through the bulk, due to "skin effect". Or the electric/magnetic field can simply propagate through free space as electromagnetic radiation, like microwaves in your microwave oven, or light through empty space. Light propagates better through vacuum than through a copper wire, doesn't it?
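(A quick back-of-the-envelope check of the skin-effect claim above, using the standard skin-depth formula. The copper resistivity value is a textbook figure, not something from the article.)

import math

# Rough sanity check of the skin-effect comment above (not from the article):
# for a good conductor, skin depth = sqrt(resistivity / (pi * f * mu)).
RHO_COPPER = 1.68e-8       # ohm*m, textbook resistivity of copper
MU_0 = 4 * math.pi * 1e-7  # H/m, permeability of free space (close to copper's)

def skin_depth_m(freq_hz, resistivity=RHO_COPPER, mu=MU_0):
    """Skin depth in metres at the given frequency."""
    return math.sqrt(resistivity / (math.pi * freq_hz * mu))

print(skin_depth_m(1e9) * 1e6)  # ~2.1 micrometres at 1 GHz: the current really does hug the surface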
        • by Anonymous Coward on Friday December 19, 2008 @09:00PM (#26180477)

          yes, once we figure out how to overcome the resistance quality of air I envision a new age where we can have wireless like youtube service.

          I will call this great thing television.

          • Re: (Score:3, Funny)

            by theaveng ( 1243528 )

            "Thief! You can't have television unless you pay Comcast $50 a month. You can't just pull television off the air!"

            So said my clueless neighbor when I said I get TV for free.

            (shaking head)

        • The issue isn't electrons and photons through the air vs. electrons and photons through a wire; it is photons through the air vs. electrons and photons through a wire.

        • Comment removed based on user account deletion
      • by Plekto ( 1018050 ) on Friday December 19, 2008 @08:21PM (#26180173)

They also do this in recording studios. It takes far less power and wiring (or can be done via RF or IR) to have each speaker have its own small amplifier than to try to power the whole room with a rack of giant units.

This also would create less interference, believe it or not, since running wires near live electrical components (even the tiny components in a circuit board make a difference - just stick an AM radio near your computer's motherboard) tends to cause interference. This is the other reason recording studios do this. They can run a very heavily shielded or wireless line-level signal to each speaker directly. Less power, less clutter, less interference.

        • by Anonymous Coward on Friday December 19, 2008 @08:36PM (#26180289)

          Powered speakers are popular because it gives monitor manufacturers a way to make line level crossovers, power amps and speaker drivers work together.
          Having control over the specifications of all those components means better fidelity. It is tidier too.

          I don't think RF or IR is ever used with studio monitors. They would cause phase alignment problems and a loss of fidelity. Simpler is better, so people use wires. Anyway, aren't we trying to avoid RF transmitters here?
          Speaker cables can be shielded too, but people don't bother as any interference would be imperceptible.

          Power loss in speaker cables is pretty tiny too. Powered speakers really are all about convenience and potential better fidelity.

          • by Plekto ( 1018050 ) on Friday December 19, 2008 @09:08PM (#26180523)

They use wireless just fine with mics and pickups and so on on stage for these reasons all the time. Fewer cables, fewer problems, and also if you've ever had to deal with grounding issues, wireless or a line-level signal that's amplified at the source is a huge improvement. I suspect that's the real problem here - too much background RF noise from the components. Rather than brute-forcing it, he decided to find a way to get around this and clean up the signal in the process.

Btw, most pros don't use wired mics any more. Too many issues. Most studios don't use non-powered speakers any more, either. You're right - I haven't found many setups that use IR or wireless (yet), but I can find many professional systems that use S/PDIF, optical, or other non-analog transmission methods. (Shoot, most home theater interconnects are now HDMI for exactly these sorts of reasons.)

            • by Anonymous Coward on Friday December 19, 2008 @09:33PM (#26180703)

              Not true.

              Wired mics sound better because they lack the companders involved in transmitting the audio signal. Performers like wireless because it's convenient, not because it sounds better. Those concerned with sound quality stick to wired.

Balanced signals use common-mode rejection to eliminate induced noise. This has been standard practice for years. Recording studios use either balanced wiring or digital in the form of AES or optical ADAT.

I'm lost on how the antenna in a phone is a major power consumer. Aren't the screen, power converters, CPU and all the modulators in the radios each consuming more power than the wire that connects the transceiver to the antenna? If it's really consuming that much power, then it stands to reason the wire should burn up.

The article is short on details and so poorly worded that I don't think it should have been published. Even if it's valid, the writing makes it look like pseudoscience.

              • by Plekto ( 1018050 ) on Friday December 19, 2008 @09:54PM (#26180827)

The problem is that the antenna isn't a major power consumer. It's that the signal path between the circuitry and the antenna is so full of junk on many models, due to poor slapped-together designs, that the signal must be boosted a lot to communicate with the local cell phone tower. In the old days this wasn't a problem, as there weren't major limits on power. Some old analog units transmitted as much as 10-20 W! Now they have to limit their power to a fraction of that. If the digital signal can't be boosted enough to communicate and it's already at that FCC-imposed limit, you're out of luck. No bars. Technically you never actually get "no bars" - you just get too little for the error correction to work any more.

                • by floodo1 ( 246910 ) <floodo1@nosPam.garfias.org> on Saturday December 20, 2008 @02:00AM (#26182151) Journal
                  It's not so much that the path between circuitry and the antenna is so full of junk because of poor designs, it's because prior to this "discovery" no one knew how to get rid of that junk.

                  Now this guy shows us a way to bypass all that and gain the efficiency of removing all those components so that less power is used to get the same amount of radiation out of the antenna.
        • Re: (Score:2, Interesting)

          by Anonymous Coward

          Powered speakers exist because it reduces the cabling between the amp and speaker to a minimum thereby reducing the resistance to a minimum and subsequently maximising the damping factor.

Whilst active monitors are common in smaller control rooms, particularly broadcast/post production and smaller music studios, you will still find passive monitor/discrete amplifier configurations in larger control rooms, particularly music studios.

Only a masochist or someone who mistakenly thinks it's easier to screen out int

      • Re: (Score:3, Interesting)

        by BitZtream ( 692029 )

        How did this get modded insightful?

It's wrong on so many levels.

First, you've confused voltage and amperage.

Second, electricity moving through matter is technically a flow of holes where atoms are missing electrons. You get more resistance when dealing with electricity in this form; fewer atoms equals more resistance since there are fewer atoms available to make hole swaps with. The skin effect when operating at high frequencies makes the effective resistance of a PCB trace higher than at direct current, but still

    • by crowtc ( 633533 ) on Friday December 19, 2008 @07:57PM (#26179981)
I'm not an antenna designer, but by the looks of it, the design is basically a miniature on-chip waveguide, efficiently channeling the RF energy toward the external antenna and minimizing wasted radiation.

      Wires radiate RF like mad unless they're heavily shielded, which is something you really can't do effectively in tight spaces. Of course, testing was done at 5.2GHz, so it will be interesting to see how it works at cellphone frequencies - packaging size might become a factor at lower frequencies.
      • Re: (Score:2, Insightful)

        by thebes ( 663586 )
Umm, dude... just because you shield a component doesn't mean it stops radiating. Shielding inhibits EM fields which are already present. To reduce radiated losses, you need to either improve the fundamental design of the circuit or make it radiate so well that you've built an antenna instead.
    • Re: (Score:1, Informative)

      by Anonymous Coward

      From the article:
      "The strategy is useful as it eliminates the need of isolating buffers, bond pads, bond wires, matching elements, baluns and transmission lines. It not only reduces the number of components and simplifies SiP design but also
      consumes lower power."

Less components = Less power?

      • >>>Less components = Less power?

That was my initial guess. Electrical circuits include a lot of "glue logic" like resistors, caps, and inductors which burn off energy as heat. Find a way to eliminate those items (i.e., connect the antenna wirelessly) and you eliminate waste.

    • Re: (Score:2, Informative)

      by e9th ( 652576 )
      From the research paper: [mwjournal.com]

      The conventional LTCC package provides 3 times more range than the proposed design but consumes 12 times more power.

      So you save power versus the conventional design, but you lose range.

      • So? You just bump the power up on the new design. 3^2 = 9, so the new design is actually claimed to be 33% more efficient.

        Still, that's not zero percent.
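(A quick sanity check of the parent's arithmetic, assuming free-space inverse-square propagation so that range scales as the square root of radiated power; the 12x power and 3x range figures are the ones quoted from the paper above.)

# Sanity check of the parent's "33% more efficient" arithmetic, assuming
# inverse-square propagation (range ~ sqrt(power)). Figures are as quoted from the paper.
conventional_power = 12.0   # relative power draw of the conventional package
conventional_range = 3.0    # relative range of the conventional package
new_power = 1.0             # relative power draw of the proposed package
new_range = 1.0             # relative range of the proposed package

# Power the proposed design would need to match the conventional range:
matched_power = new_power * (conventional_range / new_range) ** 2
print(matched_power)                        # 9.0
print(conventional_power / matched_power)   # ~1.33, i.e. about 33% more efficient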

        • by arth1 ( 260657 ) on Friday December 19, 2008 @11:40PM (#26181467) Homepage Journal

          Except that omnidirectional range is proportional to the cube of the output.
          If, as the GP says, you use 1/12 the power of a conventional device with this design, but have 1/3 the range, you need to bump the power to 3^3/12 of a conventional device to get the same range, or 27/12, or more than double.
          That doesn't seem like a win to me.

You can't violate the first law of physics: you don't get sumtin for nuttin.

      • by Anonymous Coward

        From the research paper: [mwjournal.com]

        The conventional LTCC package provides 3 times more range
        than the proposed design but consumes 12 times more power.

        So you save power versus the conventional design, but you lose range.

        To provide the same signal strength at triple the range, you need to broadcast 9 times as much power. To broadcast 9 times as much power with an equally compact transmitter, is it surprising that you need to spend 12 times as much power due to size/efficiency trade-offs?

        This doesn't sound like an advance at all.

        • by mako1138 ( 837520 ) on Friday December 19, 2008 @10:24PM (#26181015)

You are assuming an isotropic emitter, where field strength falls off as 1/r^2. That behavior is not valid for other antennas; for example, a dipole's field strength falls off as 1/r (in the far-field approximation). The comparison is complicated by the fact that the radiation patterns of the antennas used in this paper are directional and different. The "conventional" chip used a folded dipole with a "boresight radiation pattern", and the "proposed" chip used a custom design with a front-to-back ratio of 10 dB.

Table 1 has the numbers (Module Type / Power Consumption / Gain / Range):

Standalone TX chip: 3.3 mW / -34 dBi / 1 m
TX chip in conventional LTCC package: 38 mW / -1 dBi / 75 m
TX chip in proposed LTCC package: 3.3 mW / -2.3 dBi / 24 m

          Let's do some reckless hand-wavy extrapolation. The difference in power is 38/3.3 = 11.5 = 10.6 dB; if we assume perfect scaling of the new package to 38mW, we'd expect 10.6-2.3=8.3 dBi. This is an improvement of 9.3 dB over the conventional method -- it's almost 10 times as efficient.
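(The same hand-wavy extrapolation, redone in a few lines of Python under the same assumptions: the Table 1 figures above, plus the optimistic assumption that gain scales directly with drive power.)

import math

def to_db(power_ratio):
    """Express a power ratio in decibels."""
    return 10 * math.log10(power_ratio)

# Table 1 figures quoted above.
conv_power_mw, conv_gain_dbi = 38.0, -1.0   # TX chip in conventional LTCC package
prop_power_mw, prop_gain_dbi = 3.3, -2.3    # TX chip in proposed LTCC package

power_gap_db = to_db(conv_power_mw / prop_power_mw)   # ~10.6 dB
scaled_prop_gain = prop_gain_dbi + power_gap_db       # ~8.3 dBi if scaled up to 38 mW
improvement_db = scaled_prop_gain - conv_gain_dbi     # ~9.3 dB over the conventional package
print(power_gap_db, scaled_prop_gain, improvement_db)
print(10 ** (improvement_db / 10))                    # ~8.5x -- "almost 10 times as efficient"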

This analysis ignores, among other things, the relative directionalities of the antennas. I wonder why they didn't choose a more directional antenna for the "conventional" chip, or use the same sort of antenna in order to do a level comparison.

          The other point of comparison is between the "standalone" chip and the "proposed" chip. A 32 dB improvement with no power increase is nothing to sneeze at!

          • by Jott42 ( 702470 ) on Saturday December 20, 2008 @06:25AM (#26182937)
You cannot get any gain from an on-chip antenna at these frequencies: it is too small. He is comparing the use of only an on-chip antenna, which is never used in mobile phones, with the use of a coupled external, somewhat bigger, antenna on a ceramic substrate. It is not at all surprising that he gets better performance with the latter, as it is bigger. He would get even better performance with a classic mobile phone antenna, though.

I.e., this will not revolutionize the battery life of your iPhone or BlackBerry. The losses in the coupling between the integrated PA and the antenna are very small (if we disregard detuning due to human proximity effects, which is another story, and which is not influenced at all by the design in question).

The comparison between two different antennas at different powers is not very good science - it is somewhat surprising it got published. (But it is only at a small conference, so it is not that surprising.)
          • by Jott42 ( 702470 )
The really strange thing is why they didn't compare their new capacitive coupling with a classic wired connection between the PA and the antenna. Instead they introduce an additional PA, with corresponding power consumption, when they test the wired connection.

The difference between the standalone chip antenna, with a maximum size of 1 mm, and the proposed antenna, with a size of 17 mm, is not revolutionary; it is expected, due to the very poor efficiency of electrically small antennas.
    • by Anonymous Coward on Friday December 19, 2008 @08:59PM (#26180465)

      No, you don't get it: wires is how you lose power. Try disconnecting your battery and see how long it lasts then!

      In fact I should do my PhD on that.

    • Re: (Score:3, Informative)

      by camperslo ( 704715 )

      The summary is misleading.

The paper describes a method of simply and efficiently coupling energy from the transmitter VCO chip to the main antenna, making good use of the R.F. energy that chip provides. It seems that most of the power savings comes from eliminating an external buffer amplifier (and the power it uses).
That's great if the chip can provide sufficient output power, and if the spectral purity is good enough to comply with F.C.C. or other requirements. I'd expect that most cell p

    • by Jott42 ( 702470 ) on Saturday December 20, 2008 @06:42AM (#26183003)
      Yes, it is counterintuitive. And also not what is actually claimed in the paper.
      In the paper three designs are compared:

(1) One with only an antenna on the chip. That is, an antenna on the actual chip, with a size of 1 x 0.5 mm. Draws 3.3 mW, "range" 1 m. ("Range" is a very strange measure in RF design...)
(2) The same chip but without the on-chip antenna. Instead the power is coupled to an additional PA amplifier and an external small folded dipole antenna, size about 16 x 10 mm. Draws 38 mW, "range" 75 m.
(3) The same chip without the PA, with the on-chip antenna coupling to an external patch antenna of size 17 x 17 mm. Draws 3.3 mW, "range" 24 m.

      In summary: Nice engineering work, but no conclusions can be drawn, as it is very much a case of apples and oranges. (No constant TX power, No constant size, Not very much constant between the designs at all.)

      And a classic mobile phone does not use an on-chip antenna at all. So this design will not give any benefit to your iPhone or Blackberry etc.
I like the idea of using my iPhone for days at a time between charges. Heck, maybe it would provide enough battery for a useful iPhone/GPS unit.
    • I doubt that the antenna makes up the majority of your iPhone's power usage.
  • I think this joker hit the '+' button when he meant to hit the '-' button. 12 times. I don't think so.
  • by WaxlyMolding ( 1062736 ) on Friday December 19, 2008 @07:51PM (#26179919)
    ...until you consider the security ramifications.
    • by narcberry ( 1328009 ) on Friday December 19, 2008 @07:57PM (#26179969) Journal

Yeah, he'd basically be short-range broadcasting his long-range broadcast. If you got within several feet of him and used the right equipment, you might be able to listen in on everything he's broadcasting!

    • Re: (Score:1, Funny)

      by Anonymous Coward

      The ramifications of sending data a short distance to the antenna, which is then relayed a much longer distance to the base station...yeah, I'm sure those hackers are gonna pull your data off your antenna from this connection rather than the antenna's connection to the tower

    • by lysergic.acid ( 845423 ) on Friday December 19, 2008 @08:24PM (#26180207) Homepage

      what are the security ramifications? that a 3rd party might be able to intercept the wireless transmission just like they already can? whether you use this technique or not, you're still going to be broadcasting the signal wirelessly. that's why GSM signals are supposed to be encrypted.

      the GSM encryption was broken earlier this year [forbes.com]. the security ramifications of that are far more serious. why would you be worried about someone intercepting this weak wireless signal when attackers can already eavesdrop on your conversation from miles away?

      heck, if they're close enough to intercept this signal, then they're already within earshot of you. they wouldn't need to intercept the wireless signal to the antenna. anyone silly enough to do so would look rather conspicuous standing there with a laptop and a directional antenna pointed at your phone.

  • What? (Score:1, Insightful)

    by Anonymous Coward

    The explanation given on the website is very poor. The resistance of the wires connecting the transceiver and the antenna is low and little power is lost in them.

In addition, they quote him as saying "There are so many applications in the iPhone, it's like a power-sucking machine" but what they're talking about is the power lost at the antenna, not the power used by the processor, which is what he implies. Therefore it wouldn't do anything to prolong battery life when using non-transmitting applications.

    Perha

    • Re: (Score:2, Informative)

      by evanbd ( 210358 )

Definitely bad journalism. The culprit isn't wire resistance, it's reactance. The impedance mismatches at the junctions from amplifier to circuit board to connector to cable to antenna all create reflections and thus standing waves [wikipedia.org]. The power that goes into those standing waves is reflected back into the amplifier, where it is dissipated as heat. The result is that you need (in his example) a 38 mW amplifier in order to get 3.3 mW of radiated power out of the antenna.

      What his invention does is create a near
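(For anyone following the reflection argument, a sketch of the standard textbook quantities: reflection coefficient, VSWR, and reflected power fraction. The load impedances here are made-up examples for illustration; nothing comes from the paper.)

# Standard mismatch figures for the reflection argument above.
# The load impedances are made-up examples, not values from the paper.
def mismatch(z_load, z_source=50.0):
    gamma = abs((z_load - z_source) / (z_load + z_source))  # reflection coefficient magnitude
    vswr = (1 + gamma) / (1 - gamma)
    reflected = gamma ** 2                                   # fraction of incident power reflected
    return gamma, vswr, reflected

for z_load in (50.0, 75.0, 100.0, 25.0):
    gamma, vswr, reflected = mismatch(z_load)
    print(f"ZL={z_load:5.1f} ohm  |Gamma|={gamma:.3f}  VSWR={vswr:.2f}  reflected power={100*reflected:.1f}%")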

      • Can't that be fixed by better wiring?

        I'm sure the cell phone engineers aren't idiots, the impedance mismatches in existing phones will be minimal.

      • Re:What? (Score:5, Informative)

        by thebes ( 663586 ) on Friday December 19, 2008 @09:50PM (#26180803)

        Oh my god. Please not another "informative" post. I really wish you people would stop commenting on these articles when you clearly have no clue what you are talking about. The reflected power (if it happens to exist in this case...which it doesn't because these transmitters are designed quite well and usually include a circulator or isolator at the output of the amplifier to ensure an excellent match) does not go back into the amplifier, because if it did the amplifier would not work as it was designed and would either oscillate or produce extremely poor waveform quality at the output.

        Now, if you can bypass the circulator/isolator I mentioned above (which is what I gather they are trying to do in this article) then that is one less place power can be lost on the way to the antenna.

        • Re:What? (Score:4, Informative)

          by John Hasler ( 414242 ) on Friday December 19, 2008 @10:00PM (#26180863) Homepage

          The article is crap. The paper, however, makes sense. Read it.

        • Actually, a Tesla coil works on very similar principles, and the power coupling in them is very efficient.

        • by evanbd ( 210358 )
          The amplifiers in question are linear amplifiers. A linear amplifier has maximum efficiency for a resistive load. A properly impedance matched antenna appears resistive at its design frequency. An improperly matched one has a reactive impedance component (and an elevated VSWR to go with it). The reactive nature of the load decreases the efficiency of the amplifier. Whether you want to say the power is reflected back into the amplifier or never leaves it in the first place is a matter of semantics. Of
          • Re: (Score:2, Informative)

            by thebes ( 663586 )

Oh please, another software engineer? Amplifiers are by their very nature non-linear devices as a whole (they just happen to have a linear region which we can make use of). The amplifiers in question are operated within their linear region as much as possible, but certain requirements like efficiency force the designers to drive the transistor partly into its non-linear region (closer to P1dB). Some non-linearity is tolerated and is dictated by the FCC, ETSI or CRTC in the form of emissions m

      • by hughk ( 248126 )
How will this work with multiple frequencies? My phone speaks on the good old GSM band (800/900 MHz) as well as 1.8 GHz and 3G (2.1 GHz, I think). I would have thought this kind of coupling would be very sensitive to the wavelength, needing either a narrowish range or multiples.
  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Friday December 19, 2008 @07:52PM (#26179933)
    Comment removed based on user account deletion
    • Re:I don't get it. (Score:5, Informative)

      by Ungrounded Lightning ( 62228 ) on Friday December 19, 2008 @08:21PM (#26180181) Journal

      He's using a waveguide coupling to launch the wave to an external hunk of waveguide, rather than running it through pins, wires, PC board traces, etc. The latter are very lossy at cellphone frequencies.

      (I'm working on something similar right now and lose virtually all my signal going through about 6" of PC board wiring. B-( )

      • WTF? Even FR-4 only loses about 1 dB/inch at 8 GHz! Spend the bucks for better board material.

        This whole article makes no sense at all. Matching networks are not especially lossy at cellphone frequencies.

    • Re:I don't get it. (Score:5, Interesting)

      by inca34 ( 954872 ) on Friday December 19, 2008 @08:28PM (#26180223) Journal
      "The on-chip antenna feeds the LTCC patch antenna through aperture coupling, thus negating the need for RF buffer amplifiers, matching elements, baluns, bond wires and package transmission lines."

From the systems perspective he made a better RF transmitter block. Digging into that block and looking at the RF design level, he simplified the circuitry normally used, such as a matching network for the antenna, transmission lines, an oscillator (for modulating the information onto the carrier frequency), etc., into a discrete chip, as opposed to multiple printed circuit board components doing that same job.

      Beyond that I'd need to study the paper and find more detailed examples of cell phone architecture to have a better idea of the advantages and disadvantages over the legacy design.
      • Re:I don't get it. (Score:5, Interesting)

        by TigerNut ( 718742 ) on Friday December 19, 2008 @09:16PM (#26180565) Homepage Journal

        Nevermind that he's apparently ignoring the true cause of a lot of the "lost" power - which is in the various bandlimiting filters that any real cellphone pretty much can't do without. It's tough to get a good multiband filter that doesn't have 1 to 2 dB insertion loss. The apertures are also geometric, so you are automatically sensitive to odd-order harmonics in both directions.

        And I wonder how his aperture's impedance matches the amplifier out of band? From what I've seen in bleeding-edge RF architectures over the last 20 years or so, it's far easier to make a poor oscillator than a good amplifier, with any given set of components.

        • Actually, what I think he's doing isolating the oscillator while impeding the capacitive antenna, all the while the couplings' reactance which is usually between 1.85 GHz and 6.1 dB/mW is going to undergo a radical departure from its aperture (commonly also acting as the modulating amplifier) while the multiband waveguide is going to totally remove the need for the baluns. Now of course this won't have any measurable effect on the odd-order harmonics, which are going to continue to radiate (at 50 Ohms) to t
But what if we reroute the oscillator's output to the main deflector dish and convert it into a pulsed tachyon beam, thereby ignoring the impedance in the twelve lowest space dimensions? Of course the odd-order harmonics, if not compensated, might open a subspace rift, but if we tune the gravimetric scanning equipment to 139.47 THz we might be able to modulate the warp field to generate matching even-order harmonics perpendicular to the original waveguide, thereby reducing the chance of a catastrophic breach
There definitely needs to be more research on battery life... it's advancing more slowly than the gadgets, which puts a ceiling on innovation!
  • by jriskin ( 132491 ) on Friday December 19, 2008 @08:01PM (#26180015) Homepage

I mean, my phone lasts for days if I don't use it and many hours if I'm just talking. The vast majority of power seems to be used when I'm watching video, playing games, or browsing the web. My guess would be this is more CPU related.

    So even if it saves 10x in the transmit/receive it still might only be a 2x overall savings or less. I suppose it depends on usage patterns.

    • by Kent Recal ( 714863 ) on Friday December 19, 2008 @08:11PM (#26180105)

      I suppose it depends on usage patterns.

      Yes. His approach would only help people who use their phones primarily to *gasp* make phone calls. Blasphemy?

      • Re: (Score:2, Insightful)

        Or use a Web browser. Phones typically communicate with the Internet through the cellphone network over the two-way radio. This might improve WiFi phones, too, as WiFi also (obviously) employs a (much lower-power) two-way radio.

    • Goes double for WiFi, which is an extremely chatty protocol and thus sucks power. Could make WiFi much more usable in smartphones. Right now, if you play with WiFi much, you'll find that your battery gets drained fast as compared to EVDO or the like.

    • by IorDMUX ( 870522 )
      The largest battery hog on your phone is the backlight and screen. After that, you have butt-loads of internal RF processing, and then, at a distant third, the antenna itself. The CPU, PMU, etc., are all eating from the same dish, as well. (I suppose if you have an Intel Atom, though, it would be sitting above the RF processor for power consumption.) My estimation on the increase in battery power would be in the range of low to moderate double-digit percentages, but it depends heavily on usage patterns,
  • by Anonymous Coward

    Last line of the pdf:

    The conventional LTCC package provides 3 times more range than the proposed design but consumes 12 times more power.

    • Re: (Score:2, Interesting)

      by Anonymous Coward
Exactly. That means that this gives exactly zero improvement over the current arrangement. Power goes by the square of range (assuming perfect isotropic radiation). If you reduce the transmit power by 12 times, the range at which the same detected signal level would be measured should drop by a factor of 3.46. How is this better? Apples and oranges. To get a comparison showing that one is better than the other, they would have to be compared at the same received signal strength at the same range. The fact that t
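(The 3.46 figure is just the square root of 12 under the stated free-space assumption:)

import math

# Under the parent's isotropic/free-space assumption, received power ~ P_tx / r^2,
# so keeping the same received signal level while cutting transmit power by 12x
# shrinks the usable range by sqrt(12).
power_reduction = 12.0
range_reduction = math.sqrt(power_reduction)
print(range_reduction)   # ~3.46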
(only a software engineer) ... but when you tell me that replacing copper wires with a (wireless) transmitter and receiver helps save power: well, I am a non-believer. Sorry. Just does not cut it, whatever the headlines say. How about quality?

    • by paganizer ( 566360 ) <thegrove1&hotmail,com> on Friday December 19, 2008 @08:24PM (#26180195) Homepage Journal

For once, something that I'm actually qualified to post on!
I was a weapons system depot-level tech in the Navy, doing lots of work with waveguides, radar, etc. I went on to work in the private sector, doing, among other things, antenna design at Nortel.
I can't help but say this is a bunch of shit. It is ALWAYS more energy-expensive to do wireless; it's just the way things are.
If it is just the journalist making a mistake, I can see some possible advances in energy conservation using a waveguide, or even a virtual waveguide; anything else would only start to be possible if you enter the realm of high-energy physics.
Unless this guy's name is Tesla, and/or they have developed a completely new principle...

      • by Plekto ( 1018050 )

The real question is how much you have to boost the signal to overcome the interference from the electronics nearby. Since we're talking about a digital transmission, this is very much a factor. Too much background noise and you get garbage at the other end (not quite like analog wireless). As such, digital cellphones have to boost their signal until they can get a connection. Often quite a lot, in fact.

You can see this with an HDTV set and an antenna. Too low a signal and you get no picture at all.

        • by Kohath ( 38547 )

          It's much more complicated than you understand. All modern wireless communications are analog -- especially the digital ones.

          The AC post is correct.

        • by drerwk ( 695572 )
          All EM signals are analog. What are you talking about?
        • by smoker2 ( 750216 )
You appear to be talking about the power of the signal between the cell phone and a tower. The article has nothing to do with that. It is about the signal between the cellphone's transmitter circuit and the cellphone's antenna. Technically you could achieve a similar effect using an LED and an LDR to send data without wires or traces. But unless it saves power in receive mode as well, it won't help much overall. Receiving always needs more power, as I have found when working with 2.4 GHz radio.
      • by Moof123 ( 1292134 ) on Friday December 19, 2008 @09:09PM (#26180529)

I'm not as qualified as paganizer, as I usually work at much higher frequencies (mmWave). However, losses from the PA to the antenna are typically pretty low. The claim of a 12x improvement implies the current interconnects are at best 8% efficient (utter BS!).

        From the PA to the radiated signal you typically have:

1. On-PA losses because of their design. For example, they typically have at least 3 different output stages to span from just a few milliwatts (single HBT cell) up to full power (hundreds of milliwatts, hundreds of HBT cells). The parasitics of driving the unused cells at less-than-full-power operation create small losses, but I don't know a hard number for this.

2. Baluns/impedance transforms. PAs are typically class B operation with a load line that is just a few Ohms (3 V Vcc and hundreds of mA of DC current, so the RF loadline is pretty steep). Solutions are matching structures, or a push-pull architecture through a balun to transform up to 50 Ohms. These usually account for 0.5-1 dB of loss (10-20% of the power; see the dB-to-fraction sketch after this list). The invention ignores this part of a cell phone's design.

        3. Multi-band switch. Missing in this article is that most phones are designed to operate on at least 2, often 3 frequency bands. Several PA's are used, each designed to cover only one band. A GaAs phemt switch is usually used to switch between the two or more PA die. The invention does not address this aspect of cell phone design. These chips are either integrated in with the PA chip (separate die in the same carrier), or in some cases done in a different chip.

4. Lines from the PA chip to the antenna do have a modest loss, usually just a few tenths of a dB (a few percent). The article addresses this aspect of things.

        5. The antenna is a clusterfuck of design hassles, as it is often dual, or tri-band in nature. A lot of compromises go on with the antenna. Making it have multiple resonances to cover the bands is hard. Making it small is hard. Making it work with the crappy ground plane, user's hand and head, and technicolor plastic case is damn hard. The article glosses over all this, and talks about a single narrow band antenna scenario.
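(The dB-to-fraction conversion referenced in point 2, so the loss figures above can be read as percentages. This is just the standard definition, nothing from the paper.)

# Standard conversion from insertion loss in dB to the fraction of power lost,
# for the figures quoted in the list above.
def fraction_lost(loss_db):
    return 1 - 10 ** (-loss_db / 10)

for loss_db in (0.3, 0.5, 1.0, 2.0):
    print(f"{loss_db:.1f} dB loss -> about {100 * fraction_lost(loss_db):.0f}% of the power")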

        • by Plekto ( 1018050 )

          But what about the display, the back lighting, the bluetooth/wifi, the internal speaker... I can think of a lot of things in a cell phone that also cause background noise that must be overcome. Those bare traces on the circuit board are essentially also acting like a microphone for any stray RF signals. The mistake I think is that many people are equating this with analog signals. With RF interference with digital signals, it then falls back to how much you can boost the signal to have the error correct

      • Unless this guy's name is Tesla, and/or they have developed a completely new principle...

        Reversing the polarity of the main deflector??

I'm not certain, but I think that just might be crazy enough to work!
Speaking generally, though, I can see a few esoteric possibilities, but nothing that could do as much as claimed.
Your main power usage is at 2 points: the display and the antenna. You can do some amazing things with the display, like making a low-power digital paper display for normal ops and leaving your relatively power-hungry LCD off until you need something the paper can't handle.
There is quite a bit of wastage across the EM spectrum

    • (only a software engineer) ... but when you tell me that replacing copper wires with a (wireless) transmitter and receiver helps save power: well I am a non-believer. Sorry.

You're missing two things:

1) High-frequency RF is just plain weird, and
2) This is all near-field stuff; even at 5 GHz a chip package is substantially smaller than a wavelength (quick numbers below).
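(Quick numbers behind point 2. The free-space wavelength is just c/f; the few-millimetre package size is an assumed ballpark, not a figure from the paper.)

# Free-space wavelength vs. a typical chip package, to show why this is near-field territory.
# The "few mm" package size is an assumed ballpark, not a number from the paper.
C = 3.0e8  # m/s, speed of light

def wavelength_mm(freq_hz):
    return C / freq_hz * 1000.0

print(wavelength_mm(5.2e9))   # ~58 mm at the paper's 5.2 GHz test frequency
print(wavelength_mm(900e6))   # ~333 mm in the classic GSM band
# Either way, a package a few millimetres across is a small fraction of a wavelength.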

A waveguide is a far more efficient transmission line than coax or any other wireline can hope to be. If he's found a way to build a waveguide (or a reasonable facsimile thereof) through clever geometry, it could be very efficient.

If the circuit powering the antenna were the greatest consumer of power in the device, this would result in a significant improvement for the end user. However, it's all the other bits in the device which eat thousands of times more power -- the CPU, the display, the speakers, etc.

Interesting discovery, but the real-world savings will be small.

  • Kind of a misnomer (Score:4, Interesting)

    by SkOink ( 212592 ) on Friday December 19, 2008 @08:34PM (#26180273) Homepage

I don't think this will "significantly extend" mobile device battery life. As other people have pointed out, something that could practically save maybe 10 mW of battery power during transmit operation is interesting, but not really all that dramatic. On the other hand, the author doesn't appear to make the claim that it will or won't significantly extend battery life. That may be a slashdottism :)

If I understood the abstract right, the gist of this is that he designed a transmit module with a small internal loop antenna, so that a larger transmit antenna could be inductively coupled instead of electrically driven. This means that all of the bias and driver circuitry internal to the transmit chip, and also all of the bias and transmit circuitry external to the chip, could be done away with. He coupled an antenna to the outside of a microchip to utilize what would essentially be 'waste' magnetic field in a conventional transmitter.

    I would also bet that the big boys like Qualcomm probably do something similar already inside of their cell-phone modules. I would imagine that an approach like this eliminates much of the general purpose interfacing that needs to be done between some arbitrary microwave transmit module and some other arbitrary antenna, but things like cellphone transmitter chipsets are so tightly integrated that I bet they already implement something similar.

This idea is pretty useless, since it has been confirmed that cellular companies do not truthfully report the amount of battery life left, so people will make shorter calls and not take up the valuable bandwidth of theirs that they oversold.....

  • Student? (Score:5, Insightful)

    by tyrione ( 134248 ) on Friday December 19, 2008 @10:34PM (#26181087) Homepage
    What a horribly misleading title.

Ph.D. candidate... is factual and much less sensationalized.

  • "Yo dawg, we heard you like wireless so we put a wireless antenna in your wireless device so you can be wireless while you're wireless!"

Electrical engineering involves an intricate set of tradeoffs. When choosing how to couple two transmitter stages there are at least six basic ways to do it: direct, capacitive, single-tuned, double-tuned, critically coupled, overcoupled, tapped, T-section, balun, and many more. The one you choose depends on a lot of factors: efficiency, power level, bandwidth, phase linearity, space, shielding, cost, parts availability, reliability, feedback, adjustability, temperature stability, and more.

    Seeing as there
