The Care and Feeding of the Android GPU

bonch writes "Charles Ying argues that Android's UX architecture has serious technical issues because, unlike the iPhone and Windows Phone 7, Android composites the interface in software to retain compatibility with lower-class devices. Additionally, Android runs Java bytecode during animation, and garbage collection blocks the drawing system. Google engineers dismiss the desire for hardware-accelerated compositing and cite more powerful hardware and an avoidance of temporary objects as a solution to the collection pauses, but Ying argues that will just lead to [a lower] average battery life compared to devices like the iPhone."
  • by Anonymous Coward

    The interface never seems as polished without hardware acceleration. Just look at Mesa versus a full Linux desktop running ATI or Nvidia drivers with Compiz.

    • I find it funny that the article mentions the Galaxy S and the browser from the recent JPU/X/Y firmware. I've used the JPU/Y browser on a day-to-day basis, and it's horrible. Pinch-to-zoom is very smooth, and the tiling isn't really an issue, but the page sometimes gets very distorted when panning. I haven't noticed any improvements when scrolling; it's as choppy as it ever was. The JPU/Y browser has even made me consider switching to Opera/Firefox/Miren or something else as my day-to-day browser.

      Sent from my Ga

    • by Daniel Phillips ( 238627 ) on Tuesday January 04, 2011 @04:47PM (#34757890)

      Speaking as a former Googler, the smart people spin is somewhat overrated. Arrogant people is closer to the truth, and "smart" tends to mean "good at avoiding work".

      • Speaking as a former Googler, the smart people spin is somewhat overrated.

        And you say that as an unbiased observer with no axe to grind, right? :-)

        • by Daniel Phillips ( 238627 ) on Tuesday January 04, 2011 @05:43PM (#34758586)

          And you say that as an unbiased observer with no axe to grind, right? :-)

          Right. I still own all my Google shares. However, I am now properly disillusioned about a number of Google myths. But don't trust me; ask any Googler, former or otherwise. In the latter case, make sure to do it out of earshot of other Googlers.

          There are smart people at Google, and if they are really smart they learn early to keep their heads down. This seems to be the main sequence for large tech companies. Microsoft is far advanced on that path and Google seems more than a little determined to follow. The stack ranking system is nearly a carbon copy of Microsoft's, which in turn was copied from GE, and look how well that worked out. The result is inevitable degradation of the engineering culture. Now, warning about the negative consequences is not the same as axe grinding, quite the opposite.

  • Look at Samsung's Galaxy S browser. GPU accelerated and tile-based. I’m told it’s a result of Samsung’s PowerVR GPU optimizations.

    Doesn't that require that the device have a PowerVR GPU on board? What about devices without PowerVR, like the Nexus One? Does it run on those?

    When you optimize for GPUs, you have to optimize for all GPUs. I realize there are common instruction sets, but the main selling point of Android is its versatility. If I start coding only for SoCs with PowerVR GPUs because they give a better UX, then it sort of destroys any benefit I get from Android, and I might as well code for iOS because I know what that hardware will always be. A lot of the benefits of Android applications being Java bytecode, completely independent of the hardware, are overlooked in this proposition.

    The developers don't know what future devices are going to use for GPUs or their instruction sets. From one of the links Romain says:

    New devices might allow us to overcome the past limitations that made GPU support a not-so-good solution.

    Doesn't optimization for particular hardware exacerbate their issues with fragmentation?

    Well, it's open source, so there's always the smug answer that Charles Ying can go fork Android himself, add this, and watch all the handset manufacturers flock to his side. If you think it's best, get a team together and do it.

    From Ying's article:

    Wake up, Android team. Windows Phone 7 just lapped you [anandtech.com].

    Can anyone tell me why that AnandTech article from March is evidence that Windows Phone 7 has lapped Android? And why it just happened?

    • by Desler ( 1608317 ) on Tuesday January 04, 2011 @03:17PM (#34756914)

      There is this new thing called "conditional compilation" which allows one to include code that is optimized for certain hardware alongside generic code that works on all devices. I hear it's all the rage for making code work on multiple hardware and software platforms.
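      For what it's worth, the nearest Java-side equivalent (since javac has no preprocessor) is a compile-time constant flag: the compiler is allowed to drop the dead branch entirely, so a hardware-tuned path and a generic path can live in the same source. A minimal sketch with invented class and flag names, not anything from the Android source tree:

```java
// Sketch of Java's "conditional compilation" idiom via a compile-time constant.
// USE_GPU_COMPOSITING is a hypothetical per-build flag; because it is a
// compile-time constant, javac may omit the unreachable branch entirely.
public final class UiRenderer {
    private static final boolean USE_GPU_COMPOSITING = false; // flipped per build target

    public void drawFrame(Scene scene) {
        if (USE_GPU_COMPOSITING) {
            drawWithOpenGlEs(scene);       // hardware-tuned path
        } else {
            drawWithSoftwareCanvas(scene); // generic path that works everywhere
        }
    }

    private void drawWithOpenGlEs(Scene scene) { /* ... */ }
    private void drawWithSoftwareCanvas(Scene scene) { /* ... */ }

    /** Placeholder type so the sketch stands alone. */
    public static final class Scene { }
}
```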

      • There is this new thing called "conditional compilation" which allows one to include code that is optimized for certain hardware alongside generic code that works on all devices. I hear it's all the rage for making code work on multiple hardware and software platforms.

        You mean like the rage of having your code get more and more complex every time a handset comes out?

        • by Desler ( 1608317 )

          You mean like the rage of having your code get more and more complex every time a handset comes out?

          So basically the situation that already exists. All these new devices tend to need new drivers added to the kernel, and they usually involve some sort of kernel tweaking anyway. Seriously, if they can't handle complexity, then they shouldn't be the ones maintaining and developing an OS. Such work is just inherently complex.

        • by 0100010001010011 ( 652467 ) on Tuesday January 04, 2011 @03:49PM (#34757270)

          Debian seems to handle it just fine, and (based on gcc) they're compiling for 14 different platforms* and 3 different kernels (Linux, Hurd, FreeBSD).

          Is it that difficult to set up a similar thing in the app store? "Oh, it looks like you're running an ARMv5 and a PowerVR GPU. We'll give you this binary."

          Or, you do what Apple has always done with fat binaries: 68k to PPC, PPC to PPC64, PPC* to i386, i386 to x86_64. You could have one single fat binary that supported ppc, ppc64, i386, and x86_64, and it "Just worked". They were literally checkboxes in Xcode. How many GPU and CPU solutions are there for Android? This isn't low-level assembly code, it's compiled Java. (A rough Android-side analogue is sketched after the footnote.)

          *alpha amd64 armel hppa hurd-i386 i386 ia64 kfreebsd-amd64 kfreebsd-i386 m68k mips mipsel powerpc s390 sh4 sparc sparc64
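          On Android, the closest thing to a fat binary is shipping one native library per ABI inside the APK and letting the installer or loader pick the right one; app code can also branch on the reported ABI at runtime. A rough sketch under those assumptions (the library names are made up):

```java
import android.os.Build;

// Sketch: Android's rough analogue of a fat binary. The APK can carry one .so
// per ABI (libs/armeabi, libs/armeabi-v7a, ...), and Dalvik code can also pick
// an optimized path by inspecting the ABI at runtime. Library names are hypothetical.
final class NativeRendererLoader {
    static void load() {
        if ("armeabi-v7a".equals(Build.CPU_ABI)) {
            System.loadLibrary("renderer_neon");    // hypothetical NEON-optimized build
        } else {
            System.loadLibrary("renderer_generic"); // hypothetical portable build
        }
    }
}
```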

          • by MobileTatsu-NJG ( 946591 ) on Tuesday January 04, 2011 @04:02PM (#34757384)

            That depends on the optimization and how that chipset actually handles throwing stuff on the screen. 'Optimize' may not just mean "format the data this certain way and it'll fly through the processors more quickly", it could also mean "use more polygons and lower-res textures because the chipset is better at moving verts around than filling texels". It doesn't matter that it's not low-level if it affects how the whole engine works.

        • by jedidiah ( 1196 )

          No. You only take the complexity hit on the second device.

          The rest rely on the well known and well used methodology for dealing with diverse hardware you've already constructed.

          It's like the 80s and 90s never happened or some such.

          "mobile" doesn't change any of the problems.

    • Can anyone tell me why that AnandTech article from March is evidence that Windows Phone 7 has lapped Android? And why it just happened?

      He's talking about the fact that Windows Phone 7 has advanced GPU acceleration built in from the beginning in several facets of the environment. Android doesn't and it's a few years older.

      It certainly hasn't lapped it in terms of sales or momentum, but it could. Android has both the advantage and the curse of being more open and versatile, while WP7 can keep its hardware differences more constrained and build in more low-level functionality.

    • by iluvcapra ( 782887 ) on Tuesday January 04, 2011 @03:21PM (#34756978)
      I don't know enough about the Android graphics API, but if it's designed properly, it should be possible for the client to always call the same function and for the underlying implementation to select the code path most optimized for the platform. Mac OS X has one CoreGraphics API, and it either composites on the GPU or on the CPU depending on what's available. I don't see why Android can't do the same.
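      The pattern is simple enough to sketch. This isn't Android's actual internal API; every name below is invented purely to illustrate "one call site, backend picked at runtime":

```java
// Sketch: a single compositing API whose implementation is chosen at runtime,
// so callers never care whether frames are composited on the CPU or the GPU.
public interface Compositor {
    void compose(Frame frame);
}

final class GpuCompositor implements Compositor {
    @Override public void compose(Frame frame) { /* OpenGL ES path */ }
}

final class SoftwareCompositor implements Compositor {
    @Override public void compose(Frame frame) { /* CPU path */ }
}

final class CompositorFactory {
    /** Pick the best available backend once, at startup. */
    static Compositor create(DeviceCaps caps) {
        return caps.hasUsableGpu() ? new GpuCompositor() : new SoftwareCompositor();
    }
}

// Minimal supporting types so the sketch stands alone.
final class Frame { }
final class DeviceCaps {
    private final boolean gpu;
    DeviceCaps(boolean gpu) { this.gpu = gpu; }
    boolean hasUsableGpu() { return gpu; }
}
```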
      • by TheRaven64 ( 641858 ) on Tuesday January 04, 2011 @04:02PM (#34757392) Journal
        Add to that, you really don't need to optimise the code that you use for compositing on the GPU. A low-end mobile GPU can render something on the order of three million textured triangles per second. A typical mobile UI has a few dozen UI elements, with one texture and two triangles (one square) each. Even really crappy code is not going to come close to taxing the GPU. That's why we do compositing on the GPU - because it can run in its lowest-clocked mode and still provide fast performance, which lets you use the CPU for something else (or down-clock it too and save battery life).
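        A quick back-of-the-envelope check of those numbers (the element count and frame rate below are assumptions, not measurements):

```java
// Rough arithmetic behind the point above: compositing a typical mobile UI
// needs only a tiny fraction of a low-end GPU's triangle throughput.
public final class CompositingBudget {
    public static void main(String[] args) {
        int uiElements = 40;                   // "a few dozen", assumed
        int trianglesPerElement = 2;           // one textured quad each
        int framesPerSecond = 60;
        long gpuTrianglesPerSecond = 3000000L; // "order of three million"

        long needed = (long) uiElements * trianglesPerElement * framesPerSecond;
        System.out.printf("need %d triangles/s of ~%d available (%.2f%%)%n",
                needed, gpuTrianglesPerSecond,
                100.0 * needed / gpuTrianglesPerSecond);
        // Prints roughly: need 4800 triangles/s of ~3000000 available (0.16%)
    }
}
```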
        • Ahem...

          Add to that this... Most SoCs run a fairly narrow and slow memory bus. Also, mobile GPUs tend to be WAY slower than CPUs...

          Fancy racing a 4 * 150 MHz pipe GPU against a 1 GHz superscalar CPU with 64/128-bit SIMD extensions?

          Who will win?

          Answer... the memory bus. You can TRY and get the CPU and GPU to work together but all that will happen is that the memory bus will get swamped and everything slows down.

          GPUs can render polys with straight edges. UIs frequently want curved, rounded objects with complex grad

    • by bhcompy ( 1877290 ) on Tuesday January 04, 2011 @03:35PM (#34757132)

      When you optimize for GPUs, you have to optimize for all GPUs. I realize there are common instruction sets, but the main selling point of Android is its versatility. If I start coding only for SoCs with PowerVR GPUs because they give a better UX, then it sort of destroys any benefit I get from Android, and I might as well code for iOS because I know what that hardware will always be. A lot of the benefits of Android applications being Java bytecode, completely independent of the hardware, are overlooked in this proposition.

      You mean it's like a real, honest-to-goodness computer operating system? Oh no! The horror! Guess you should stop making software for Windows, Linux, and OS X then, since the hardware can provide different capabilities for systems using the same operating system!

      • by bonch ( 38532 )

        You mean desktop computers, those complicated machines that mobile devices and tablets are replacing? People are trying to get away from managing device drivers and hardware compatibility bugs.

        • People are trying to get away from managing device drivers and hardware compatibility bugs.

          Maybe I'm just weird, but that's not why my desktop and laptop computers have been getting lonely lately. It's because my mobile device is, you know, mobile.

    • You don't need to optimize for all of them. Go ahead and pick some of the more popular ones that already exist and optimize for those. Manufacturers are still free to choose different hardware or write their own code, since Android is open.

      Having a few 'better' options isn't going to be worse than the lowest common denominator crap that's going on now.
    • by TeknoHog ( 164938 ) on Tuesday January 04, 2011 @03:43PM (#34757216) Homepage Journal
      Somebody should really invent a programming interface for graphics. You could use hardware or software rendering for the same code, or generally a mixture of both, depending on the capabilities. It could be called "open graphics library" or something.
    • Have we not learned anything from the desktop? In 2011, as an app developer, you shouldn't be optimizing for any specific GPU. The platform's graphics API should be optimizing for whatever hardware it supports. New device? New drivers! If Samsung's PowerVR implementation makes such a big difference, then we will see other hardware designs adopt PowerVR. You don't need to reinvent the wheel every single time. Media-minded people will favour devices with faster GPUs, just like we do with PCs.

      I'm

    • by JamesP ( 688957 )

      When you optimize for GPUs, you have to optimize for all GPUs. I realize there are common instruction sets, but the main selling point of Android is its versatility.

      Yes

      That's why Apple uses LLVM to compile from 'generic GPU code' to 'GPU code optimized for Blah'. That's on Mac OS X, and maybe on the iPhone as well.

      http://llvm.org/Users.html [llvm.org]

    • by anethema ( 99553 )
      Actually Android does not use Java bytecode at all.

      Java is the language used to program for Android, but the compiled code is converted into Dalvik bytecode; that's what runs on the device. There is no J2ME support at all, nor any Java virtual machine on the device.

      http://en.wikipedia.org/wiki/Android_(operating_system)#Features

      Look at Java Support.
  • OS X uses LLVM just-in-time compilation in the graphics stack. If the hardware supports it, work is sent to the GPU; otherwise, it's done in software. Since Android is based on Dalvik, they shouldn't have a problem doing something similar. Sure, they need to support cheap pieces of shit, but that doesn't mean they can't support anything else.
    • by h4rr4r ( 612664 )

      Which means you now have to support how many cards? Are the vendors going to provide drivers that can even go into the Android project?

      • by Entrope ( 68843 )

        Cell phones are not known for having many kinds of graphic cards available. Heck, they're not even known for using many kinds of discrete graphics chips. I will even go so far as to say that they draw from a rather narrow set of embedded graphics cores.

        PowerVR is pretty much the market leader, ARM is playing catch-up with Mali, and then you get the long tail. GPU core and SoC vendors know how to work together to deliver usable libraries for chip buyers -- witness the OpenGL ES acceleration that is availa

        • by h4rr4r ( 612664 )

          Usable for chip buyers, which probably means not generally available. And since Google is not a chip buyer, what are the odds they will get GPL-, BSD-, or Apache-licensed code?

          • by Entrope ( 68843 )

            Google gets GPL-, BSD- and Apache-licensed code all the time. If you meant to ask whether Google will get device driver code that falls under those licenses for the various cores and system-on-chip platforms that are out there, the answer is that Google doesn't need that. Google needs an API like OpenGL ES for Android to use. The component vendors typically make sure that OpenGL ES drivers for Linux exist in distributable form.

            • by h4rr4r ( 612664 )

              Which means kernel updates become a huge pain. I buy desktops and laptops with open drivers for a reason; I want GPLed drivers on my phone too.

  • by mcrbids ( 148650 ) on Tuesday January 04, 2011 @03:43PM (#34757210) Journal

    OMG Android is making a play that's designed to let lower cost, highly capable devices subsist in the marketplace? How horrible is that?

    I switched from Evil Major Network (TM) to Metro PCS a little over a year ago, and haven't regretted it for a SECOND. It is so nice, getting what you paid for, rather than wondering how much you'll be overcharged for what you aren't even sure you got... it's the ONLY way to survive teen children!

    And even Metro PCS, the low-price leader, offers a couple of Android phones that are highly capable and useful. For less than $300 I was able to upgrade my wife's shatty phone to a nice, capable Android phone with GPS, navigation, browser, email, games, full-screen YouTube, Facebook, Marketplace, et al. (AKA "the works") and a good, full day of battery life. She LOVES the phone! In case you are wondering, it was the LG Optimus. And the network cost went from $40/month to (gasp!) $50...

    Talk about having your cake and eating it too?

    Say what you want, Android's strategy is working, as demonstrated by its continuing skyrocketing market share.

    • Re: (Score:2, Insightful)

      by BitZtream ( 692029 )

      OMG Android is making a play that's designed to let lower cost, highly capable devices subsist in the marketplace? How horrible is that?

      Pretty bad actually. Considering we've been using APIs that allow hardware acceleration with fallback to software implementations like OpenGL to handle JUST THIS EXACT PROBLEM for the last 25-30 years, I'd say it was absolutely shitty for Google to fuck up this badly.

      Of course, perhaps that's because I understand the problem as stated, whereas you're comparing service provide

      • The iPhone had flaws, such as the lack of true multitasking, which were/are being fixed. Android also has flaws, which were/are being fixed. Yes, the Android ecosystem has a disadvantage in that it is subject to fragmentation, and the iOS ecosystem has a disadvantage in that Apple tries to exert too much control over developers. I expected Android to catch up with iOS much more quickly than it did; that was my mistake. By most reports, Froyo is the first Android release that is really ready for prime time -
    • by bonch ( 38532 )

      OMG Android is making a play that's designed to let lower cost, highly capable devices subsist in the marketplace? How horrible is that?

      Horrible enough to hurt the user experience, unless you think it's okay for software-based compositing to inefficiently drain your battery or for scrolling to be choppy years after the iPhone 1 on less powerful hardware was smooth and responsive. If you don't think it's important for animated user elements on a touchscreen to not pause, then you're just being a fanboy. If A

  • Feeding the what? (Score:4, Informative)

    by Big_Mamma ( 663104 ) on Tuesday January 04, 2011 @03:49PM (#34757280)

    Let me tell you one thing about that: Java isn't the problem. By my definition of feeding the GPU (triangles/sec, fill rate, and OpenGL ES objects/sec), Java is just 10% behind a raw C benchmark like GLBenchmark 1.1/2.0. They quoted 880 kT/s; I managed 750 kT/s in non-native code. And to get that, you have to carefully feed the GPU with the right batch sizes, avoid issuing too many state changes, pack things interleaved in the vertex buffer, avoid dynamic point lights, etc. (see the sketch after this comment). It isn't as bad as an NDS, but the Snapdragon GPU is quite hard to tame.

    The problem with using the GPU is that every context switch requires a complete reinitialization of the GL context. Even on a PC, alt-tabbing into and out of fullscreen games takes ages. It's fine when specific applications that require the speed use it directly, but not when going from one activity to another gives you a loading screen.

    Animation performance and touch responsiveness? Is that the best he can come up with for such a title? I have no idea what he's talking about, but scrolling the browser works just fine here on a not-so-recent HTC Desire. The only time things break down is when the garbage collector halts everything for a third of a second (see DDMS/logcat messages), and those pauses are reduced to sub 5ms in the new builds. That's tons more useful than rendering surfaces to quads and using OpenGL ES to draw them, and IMO, the Android team made the right decision.
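    For the curious, here is a rough idea of what "right batch sizes, interleaved buffers, few state changes" looks like in practice: many quads packed into one interleaved client-side buffer and drawn with a single call. It's a sketch built on the public GLES20 API, not code from the Android browser; the handles and layout are assumptions:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import android.opengl.GLES20;

// Sketch: batch many UI quads into one interleaved vertex buffer
// (x, y, z, u, v per vertex) and issue a single draw call, instead of
// one state change plus one draw call per element.
public final class QuadBatch {
    private static final int FLOATS_PER_VERTEX = 5;            // 3 position + 2 texcoord
    private static final int BYTES_PER_VERTEX = FLOATS_PER_VERTEX * 4;

    private final FloatBuffer vertices;
    private int vertexCount;

    public QuadBatch(int maxQuads) {
        vertices = ByteBuffer.allocateDirect(maxQuads * 6 * BYTES_PER_VERTEX)
                             .order(ByteOrder.nativeOrder())
                             .asFloatBuffer();
    }

    /** Append one quad as two triangles: 6 vertices, 30 interleaved floats. */
    public void addQuad(float[] thirtyInterleavedFloats) {
        vertices.put(thirtyInterleavedFloats);
        vertexCount += 6;
    }

    /** Submit the whole batch with one draw call. The attribute handles are
     *  assumed to come from a shader program compiled elsewhere. */
    public void draw(int positionHandle, int texCoordHandle) {
        vertices.position(0);
        GLES20.glEnableVertexAttribArray(positionHandle);
        GLES20.glVertexAttribPointer(positionHandle, 3, GLES20.GL_FLOAT,
                false, BYTES_PER_VERTEX, vertices);

        vertices.position(3); // texcoords start after x, y, z
        GLES20.glEnableVertexAttribArray(texCoordHandle);
        GLES20.glVertexAttribPointer(texCoordHandle, 2, GLES20.GL_FLOAT,
                false, BYTES_PER_VERTEX, vertices);

        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);

        vertices.position(0);
        vertexCount = 0;
    }
}
```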

    • by dgatwood ( 11270 )

      The problem with using the GPU is that every context switch requires a complete reinitialization of the GL context. Even on a PC, alt-tabbing into and out of fullscreen games takes ages. It's fine when specific applications that require the speed use it directly, but not when going from one activity to another gives you a loading screen.

      Odd, I've never noticed such problems in Mac OS X, even though every window in the OS is composited by the GPU. Maybe it's not so much that GL is inherently flawed s

    • On my Android G1 (which can only run Android 1.6), occasional long pauses before responding to touch input ARE a problem. But I assumed that was fixed on newer phones with faster CPUs running Froyo or Gingerbread. I haven't seen any problems on the Nook Color, which is really just a cheap Android tablet. Does anybody with experience with the newest Android releases know whether or not these problems are already fixed?
  • It seems like something like MeeGo (Linux + GL + Qt) would be the best way to go, if you are not an Apple device.

    I never understood why anyone would want to interpret bytecode on a battery-powered device, or give up control of garbage collection. Maybe the VM enforces things like local file system access, but a few lines in the kernel can enforce that too.

  • Symbian^3 also needs a GPU and has very good power management and doesn't support older hardware ... oh, but wait, this is Slashdot....Symbian bad and old fashioned and hopeless...Android good.....

  • Well, as expected, Android is targeted at average mass consumers, so average battery life is acceptable. What gives?
  • and cite more powerful hardware and an avoidance of temporary objects as a solution to the collection pauses

    LOL.

    Java is pretty much the only GC language I'm aware of where temp objects are passed to the GC. Perl (and I'm sure a myriad of other GC languages) takes note at compile time of which objects are not used outside their context and destroys them immediately. IIRC, Java is the only language where they blindly send all objects to the GC, regardless. Obviously, in the long term that hurts latencies: the GC has to recycle them eventually, and if there is no spare CPU/core, it has to take the time from other threads.

      Java is pretty much the only GC language I'm aware of where temp objects are passed to the GC. Perl (and I'm sure a myriad of other GC languages) takes note at compile time of which objects are not used outside their context and destroys them immediately. IIRC, Java is the only language where they blindly send all objects to the GC, regardless. Obviously, in the long term that hurts latencies: the GC has to recycle them eventually, and if there is no spare CPU/core, it has to take the time from other threads.

      You're forgetting InterLisp (which probably is best forgotten). When Xerox released the InterLisp-D machines, even files were garbage collected, meaning that if you deleted a large file, the machine went into a 10-minute garbage collection cycle, effectively blocking it from doing anything else. I'm sure there are other languages that use poorly-designed GC as well. I've never really liked garbage collection, but the alternative is to use IBM MVS-style static preallocation of a "dataset" for everything. This

      • [...] but memory is dirt cheap these days.

        On embedded platforms there are no cheap resources.

        As a software developer, I was once asked by a friend how it could be that the iPhone, with its measly CPU/GPU, has smoother UI animation than a monster of a phone like the HTC Desire HD. I reminded him that a long, long time ago, when 640K of RAM was a lot and high-end CPUs ran at 25 MHz, animation in games was smooth too. It depends more on the software developer's skills than on the hardware. Apple takes time to polish it to perfection -

    • by voidptr ( 609 )

      This isn't true with modern JVMs and JIT compilers. Almost all modern JVMs perform escape analysis on hot code to determine whether an object can become visible outside the local scope, and use stack allocation instead of the heap if appropriate.
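      A tiny example of the kind of allocation escape analysis can get rid of (on HotSpot this needs -XX:+DoEscapeAnalysis, which is on by default in later Java 6 updates; whether Dalvik did anything similar at the time is a separate question):

```java
// Sketch: 'Point' never escapes sum(), so a JIT with escape analysis can
// scalar-replace it (no heap allocation, nothing for the GC to collect)
// once the loop gets hot. Without escape analysis, this is 10 million
// short-lived objects for the collector to chew through.
public final class EscapeDemo {
    private static final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    static long sum(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            Point p = new Point(i, i * 2); // candidate for scalar replacement
            total += p.x + p.y;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sum(10000000));
    }
}
```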

      • Any pointers to the Java specific information?

        We recently did long-term, latency-centric tests and clearly saw that Sun's Java 6 doesn't do it. The stateless network server literally halts for 500 ms or more every time GC runs, while the server is supposed to run with latencies under 100 ms. Memory allocation is also OK, as (1) there is no global state/data/etc. and (2) the server's work is essentially to dump some of the incoming traffic to disk and send a response back.

        • by JackDW ( 904211 )
          Sounds like you need a real-time JVM. These do garbage collection continuously, without creating large latency spikes. You could look at the Jamaica VM from Aicas - it has real-time garbage collection and should be a more reliable platform for your server application.
  • no android based system will ever, ever be as great as the iPhone. clearly it is foolish to even dream of such a thing.

    what is becoming more and more obvious with each passing month is that nobody cares. Android is outselling everything else by an ever increasing margin.

  • by hey ( 83763 )

    It sounds like the Google engineer is taking the sane approach. He is trying stuff and testing the speed. Sounds like he'd try GPU if it helped.

  • So some random person on the Internet who doesn't appear to have much to do with Android points out a couple of not-really-problems, and suddenly everyone is supposed to drop everything and fix them?

    If you search for "charles ying android" every link comes back with a reference to this single blog post. I could take him seriously if I'd ever heard of him in the context of Android development, or even at all...

  • Java bytecode? (Score:4, Informative)

    by DragonWriter ( 970822 ) on Tuesday January 04, 2011 @06:49PM (#34759480)

    Additionally, Android runs Java bytecode during animation

    Except in the real world, where Android uses a non-Java VM with its own bytecode, and doesn't run Java bytecode at all.

  • by SoftwareArtist ( 1472499 ) on Tuesday January 04, 2011 @07:01PM (#34759606)
    There's a myth going around that battery life is strongly affected by how efficient your code is. On most phones, it's simply not true. By far the biggest power drains are the screen and the radios (cellular, Wi-Fi, Bluetooth). On Android, there's even a handy battery monitor built in that you can use to confirm this (Settings->About phone->Battery use). I can spend half an hour playing a high-end, graphics-intensive game (Hero of Sparta) on my Nexus One, and when I then check the battery use, I find that even while I was playing, the screen and cellular radio standby were still the dominant uses of power.
