POWERVR Mobiles
category: offtopic [glöplog]
Be very wary of the emulators - there's a whole world of difference between "emulator" and "full emulation including OpenGL ES 2.0 with accurate speed".
Personally, I don't use the emulator at all any more for anything beyond a standard GUI-based app. Too many times the emulator has led to a ton of wasted time because of misleading speed or features (worst case, the emulator supporting APIs that don't even exist on the actual phone, although at least that's been fixed).
That's based on my experience with iOS btw.
The Android SDK has a VM for all versions. It is slow though. But you can even debug stuff on the real device if you use a USB cable and install the right ADB driver. Works like a charm here with Eclipse...
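For reference, a typical on-device session over the cable looks something like this (the .apk name is obviously just a placeholder):
Code:
adb devices                # check the driver works and the phone shows up
adb install -r MyDemo.apk  # (re)install the current build on the device
adb logcat                 # stream the device log while the app runs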
Yeah, debug on the device is ideal. It's just unfortunately much slower (on iOS at least), because you have to wait for the app to copy to the device + install. Not a major problem, but pretty annoying if you're doing one of those debug sessions that involve constantly restarting the app.
Does the android emulator do hardware emulation then? And how does that work with the million odd CPU/GPU/RAM combinations there are?
The iOS one just compiles the code for Intel and runs it natively (but still within the emulator, using iOS frameworks etc.). It's super fast, but of course that means it's WAY faster than the real hardware, with access to way more memory. It's easy to screw up on performance using the emulator. In the past, it would even happily use OS X API calls from within the emulator, but at least that's been fixed!
There's also a bunch of stuff that just won't work in an emulator of course, like multitouch, accelerometer/gyro, camera, and slow/unreliable network links. If any of those matter, avoid the emulator ;)
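If it helps anyone, you can at least keep simulator-only fallbacks out of device builds with a compile-time guard. Just a sketch - TARGET_IPHONE_SIMULATOR comes from the SDK's TargetConditionals.h, the USE_FAKE_SENSORS define is a made-up name:
Code:
#include <TargetConditionals.h>

#if TARGET_IPHONE_SIMULATOR
  /* Simulator build: no camera, gyro or multitouch hardware to talk to. */
  #define USE_FAKE_SENSORS 1
#else
  /* Device build: use the real hardware. */
  #define USE_FAKE_SENSORS 0
#endif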
Android has ADB, a debug bridge that lets you debug against differently configured VMs and can also connect to real devices. The normal debug environment is a VM, and thus relatively slow. Debugging on a real device involves copying the application to the device, starting it and connecting the debugger, which takes some time, but after that it's fast and you have all the hardware features. Much better imo.
The VM/emulator seems to do GLES 1.1 at least, dunno about 2.0 though. Got no idea about CPU/RAM and shit. Might be configurable...
Took me 2-3 evening sessions to go from 0 (no Eclipse, Android SDK, ...) to a working GLES application being debugged on my device, which I consider ok.
Btw. there's also the Android NDK to create native (ARM) code in C/C++ for speed. Haven't tried that though...
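For anyone curious, the NDK side is basically a plain C function with a JNI-mangled name. A minimal sketch - the package, class and function names below are made up for illustration:
Code:
#include <jni.h>

/* Java side declares: public native float mixValue(float a, float b, float t);
   and loads the library with System.loadLibrary("demo"). */
JNIEXPORT jfloat JNICALL
Java_com_example_demo_MainActivity_mixValue(JNIEnv *env, jobject thiz,
                                            jfloat a, jfloat b, jfloat t)
{
    /* Trivial body, just to show the calling convention; the heavy per-frame
       work would go in functions like this. */
    return a + (b - a) * t;
}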
Sounds pretty much the same as iOS, except maybe they went for accuracy over speed in the emulator where apple went for speed over accuracy.
Quote:
where apple went for speed over accuracy.
Well, their GLES 2.0 emulation is conformant and slow; the same can't be said (in most cases) for other translation layers that just translate to desktop GL.
I tend to develop in Visual Studio with POWERVR VFrame ES 2.0 emulation, cross-compile in a VM, then run/debug on the target (i.e. a Pandaboard or similar) via an NFS mount and an SSH session. This allows very fast iteration.
Weyland Yutani: Doesn't run (framebuffer not complete exception) on Samsung Galaxy S2.
Ah right. I was thinking more of everything else rather than GLES there - I've only touched GLES on the simulator briefly, everything else has been video based and there's no choice but to run + debug on actual hardware.
Everything non-GLES I write gets mostly done in the simulator, and it's most definitely fast but inaccurate ;) I regularly end up re-writing stuff because it's too slow on hardware, or bugs suddenly get exposed that were invisible in the simulator. But that's not TOO often, and the simulator is so fast to work with it still wins out overall.
Weyland Yutani, the thing you have to understand about the Galaxy S2 is that it has an ARM Mali GPU so it's RUBBISH!
You must mean Kusma ... I have a HTC Incredible S (Snapdragon) and it works like a charm, USB ADB debugger is sweet ;)
Just saw something funny and related!
There was some news item about the new imagination sgx 6 series GPUs in my RSS stream in google reader. Including a video of muon baryon! How cool is that, 1. it's powerful enough to run muon baryon, and 2. they're featuring a scene demo to show off their new GPU!
But, alas, no. It seems to be a google reader bug, because that video is getting tagged onto random articles. I'm subscribed to ferris' blog, so that's obviously where it came from. Strange bug though :)
Btw, any suggestions on where I can read up on image noise reduction on the GPU? There's a lot out there, but it's dotted all over; a site with a good selection of papers would seriously help. I'm implementing that on SGX at the moment, and it's tough to get good noise reduction without losing a lot of detail while keeping speed really high.
My first attempt was a nasty branch-heavy median filter that managed 2fps :/ I now have a branchless version that I suspect is fast enough (not yet tested on hardware...) but quality is still an issue.
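For reference, a branchless (pseudo-)median can be built from nothing but min/max - just a sketch, and only an approximation, since it takes the median of the three row medians rather than the true 3x3 median. In a shader it's the same expressions on vec3s with GLSL's min/max:
Code:
#include <math.h>

/* Branchless median of three; in GLSL ES this is just
   max(min(a,b), min(max(a,b), c)). */
static float med3(float a, float b, float c)
{
    return fmaxf(fminf(a, b), fminf(fmaxf(a, b), c));
}

/* Pseudo-median of a 3x3 neighbourhood: median of the three row medians.
   Cheap and branch-free, but not the exact 2D median, so some quality
   difference is expected. */
static float pseudo_median3x3(const float p[3][3])
{
    float r0 = med3(p[0][0], p[0][1], p[0][2]);
    float r1 = med3(p[1][0], p[1][1], p[1][2]);
    float r2 = med3(p[2][0], p[2][1], p[2][2]);
    return med3(r0, r1, r2);
}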
Weyland: Which framebuffer attachment combinations are supported is implementation-defined; you're probably using one that is supported on Adreno but not on Mali.
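A sketch of how to detect that at runtime instead of finding out via an exception, assuming a GLES 2.0 context is current:
Code:
#include <GLES2/gl2.h>

/* Returns 1 if the currently bound FBO's attachment combination works on this
   driver/GPU.  GL_FRAMEBUFFER_UNSUPPORTED is exactly the "legal, but this
   implementation won't do it" case. */
static int fbo_combination_ok(void)
{
    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status == GL_FRAMEBUFFER_COMPLETE)
        return 1;
    /* Unsupported or incomplete: fall back to another attachment combo. */
    return 0;
}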
pete: Yeah, that's why it's whooping SGX's ass in benchmarks ;)
Kusma: I didn't do that port, I would ask people to test it here before I'd add it to the market anyway ;)
Weyland Yutani: Aha, ok. Sorry for blaming you, then :P
So who do we blame? Pete with his wild accusations (or was it a wild beard?) or maali with his rubbish GPUs? :)
Marmalade (www.madewithmarmalade.com) is a great SDK. It lets you write native C++/GLES code and run it on a GLES emulator under Windows, and you can then deploy it to iOS, Android, Bada, WebOS and Symbian.
Works quite nicely!
Err, none of you would happen to know of an SGX 530 technical reference manual, by chance?