HTML 5 Audio playlists on iOS and Android

Posted: October 10, 2012 by Ash Mishra in Html 5, Mobile, Programming

Here is working code tested on iOS 5.x+ and Android 2.3+, using the HTML 5 Audio tag and jQuery, to play two mp3 tracks back to back.

On both Android and iOS, autoplay works (tested over wifi on iOS 6; not tested with 3G).

On iOS there is even a bonus: audio tracks continue to play with Safari in the background, and even with the screen off. The Previous, Next, Play, and Pause controls work too!

On Android, if the screen is off or the web page isn’t visible, the audio unfortunately won’t advance to the next track. Returning to the browser starts it again, though.

<!DOCTYPE html>
<html>
<head>
<meta name="viewport" content="width=device-width">
<!-- jQuery is required here; the original script URL was not preserved in this post -->
<script src=""></script>
</head>
<body>

The built-in HTML 5 audio player is shown for debug purposes. You should control the audio with the buttons below.

<audio autoplay controls="controls" id="player" x-webkit-airplay="allow" style="opacity: 1; top: 0px; left: 0px; z-index: 1; margin-bottom: 10px; display: block; width: 250px;">
  <source id="audio_source" src="audio/starbuck.mp3" type="audio/mpeg" />
</audio>

<button onClick="playNow();">Play</button>
<button onClick="pause();">Pause</button>
<button onClick="playNext();">Next</button>

<script type="text/javascript">
  $(function() {
    $('#player').bind('ended', playerEnded);
    $('#player').bind('canplay', playerLoaded);
  });

  function playerEnded(e) {
    // window.alert('player ended');
    if (!e) { e = window.event; }
    playNext();
  }

  function playNow() {
    document.getElementById('player').play();
  }

  function pause() {
    document.getElementById('player').pause();
  }

  // Swap the source element's src to the next track, re-attach it,
  // then reload and play the player.
  function playNext() {
    if (window.HTMLAudioElement) {
      // window.alert('playing next');
      $('#audio_source').attr("src", "audio/guacamole.mp3").detach().appendTo($('#player'));
      var player = document.getElementById('player');
      player.load();
      player.play();
    }
  }

  function playerLoaded() {
    // window.alert('player loaded');
  }
</script>
</body>
</html>

Facebook and disclosure of your private information

Posted: October 3, 2012 by Ash Mishra in Uncategorized

This afternoon several colleagues and I have been looking at sites that use Facebook to create accounts.  The idea is that if you’re already logged into Facebook, it is easy for us to use Facebook as a conduit to create an account for you on our website, reducing the number of usernames and passwords you need to remember as well as lowering the barrier to getting you to sign up.

You’ve probably seen this as “Register with Facebook” or “Login with Facebook” on many sites.  Below is a screenshot from Rdio, where you can see the Facebook Register button.


Clicking Register brings up the following page:

Rdio's create an account using Facebook

So if you have a Facebook account, you can click on Log in to prefill the form below with your profile information.

Clicking on this link brings up this:

Facebook login panel for Rdio

Now this gets a bit techie, but stick with me.

The url being linked to from Rdio is the following:*%22%2C%22type%22%3A%22checkbox%22%7D%5D&height=448&locale=en_US&onvalidate=facebookValidate&

What this URL is requesting from Facebook is the following:

  1. Your full name
  2. Your email address
  3. Your gender
  4. Your birthday
  5. Your location

That’s a lot of information, and we would hope there is some way to limit what can be sent back to websites.  Facebook permits you to control what data is passed through your friends (in the Privacy settings, below), but I have not been able to find a way to change what can be directly requested by a website.

Facebook's privacy settings

The result is that, without controls to change what Facebook can share with other websites, it will return your private data to any website that asks for it.  The image below shows what gets returned to Rdio (fake data supplied via my Facebook account).

Update:  After reading a blog post by Facebook on registration, it turns out the panel below is not actually hosted on Rdio; it’s an iframe (a partial window) on the Facebook domain.  Only when a user clicks Register does your private information get sent from Facebook to the requesting website.

Facebook details returned to Rdio

The Create your Rdio account panel – before clicking “Log in to Prefill…” – doesn’t explicitly say what it will request from Facebook.  If you read the Privacy policy very carefully, it’s in there, but please – who wades through mounds of legalese?  Honestly, I think it’s about time that sites do the right thing and fully disclose what data they are going to get from Facebook.  And Facebook should do the right thing and implement privacy controls – opting users’ details out by default, and allowing them to opt in and choose what to share.

Rdio is a legit site, and I love it.  But there are many sites out there that we visit, and any number of those could have weaker security or less honest intentions.  Any of them could ask you to sign up with your Facebook account – and what you believe is private in your Facebook account isn’t.

Sensor Fusion Redux

Posted: January 2, 2012 by Mike Ilich in Design, Programming

In the previous article, we discussed polling individual gyroscope axis values to drive the orientation of an object model in virtual 3D space. As a review, the basic structure for developing a sensor-based UI for 3D navigation is driven entirely by the Core Motion framework in iOS, in which the CMMotionManager class is instantiated as a direct interface to the onboard accelerometer and gyroscope hardware in iPhone, iPod touch and iPad devices (the gyroscope is available as of the iPhone 4, 4th-generation iPod touch and iPad 2). Sample code to initialize the CMMotionManager is provided below:

   (CMTestViewController.h, in the private interface variable declaration area)
      CMMotionManager *motionManager;
   (CMTestViewController.m, in viewDidLoad or similar setup)
      motionManager = [[CMMotionManager alloc] init];
      motionManager.accelerometerUpdateInterval = 0.01;
      motionManager.deviceMotionUpdateInterval = 0.01;
      [motionManager startDeviceMotionUpdates];

At this point, the motionManager variable can be polled at any point in your code to retrieve a CMDeviceMotion structure describing the physical motion of the device in discrete intervals defined by the update-interval properties set above. The accelerometerUpdateInterval and deviceMotionUpdateInterval properties represent the interval (in seconds) at which accelerometer and device-motion updates are provided, respectively. The six degrees of physical motion can be described by accelerometer values for each independent axis (X, Y, Z) as well as the overall attitude of the device, derived from the embedded “attitude” property, which can be retrieved through:

      CMAttitude *attitude = motionManager.deviceMotion.attitude;
This variable contains properties such as “.yaw”, “.pitch” and “.roll”, which we previously used to describe rotation of the device about its Z, X and Y axes respectively. In addition, “attitude” provides a property named “rotationMatrix” containing a full 3×3 matrix describing the Euler rotations required to orient a rigid body in 3D Euclidean space. The Euler angles represent a series of aggregate rotations, each around a single axis, that define the orientation of a point of reference in a three-dimensional coordinate system. Therefore, the rotation matrix derived from the gyroscope attitude can be described as the product of three elemental rotation matrices, each representing a single axis. As mentioned in the previous post, the order in which these rotations – yaw, pitch and roll – are applied can significantly affect the overall orientation of a point in 3D space.
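To make that order dependence concrete, here is a small sketch in plain JavaScript (the angle values are arbitrary illustrations; this is not the Core Motion implementation) showing that composing the three elemental rotation matrices in different orders yields different composite rotations:

```javascript
// Sketch: the rotation matrix as a product of three elemental rotations.
// Composing the same yaw/pitch/roll in a different order yields a
// different composite matrix. Angles here are arbitrary illustrations.
function rotX(a) { // pitch: rotation about the X axis
  var c = Math.cos(a), s = Math.sin(a);
  return [[1, 0, 0], [0, c, -s], [0, s, c]];
}
function rotY(a) { // roll: rotation about the Y axis
  var c = Math.cos(a), s = Math.sin(a);
  return [[c, 0, s], [0, 1, 0], [-s, 0, c]];
}
function rotZ(a) { // yaw: rotation about the Z axis
  var c = Math.cos(a), s = Math.sin(a);
  return [[c, -s, 0], [s, c, 0], [0, 0, 1]];
}
function mul(A, B) { // 3x3 matrix product
  return A.map(function (row) {
    return [0, 1, 2].map(function (j) {
      return row[0] * B[0][j] + row[1] * B[1][j] + row[2] * B[2][j];
    });
  });
}

var yaw = 0.3, pitch = 0.5, roll = 0.7;
// Apply yaw, then pitch, then roll...
var zxy = mul(mul(rotZ(yaw), rotX(pitch)), rotY(roll));
// ...versus roll, then pitch, then yaw: a different orientation results.
var yxz = mul(mul(rotY(roll), rotX(pitch)), rotZ(yaw));
```

Comparing zxy and yxz element by element shows they differ for generic angles, even though each is built from the same three elemental rotations.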

In addition, these operations can introduce gimbal lock: the loss of one degree of freedom when two of the three rotation axes in 3D Euclidean space become parallel and the system is locked into rotation in 2D space. We observed this phenomenon when either the gyroscope X or Y axis was rotated into parallel with the Z axis, effectively locking rotation about the remaining axis. To counteract this effect, a static fourth axis is added by embedding attitude.rotationMatrix in a 4×4 matrix with an identity fourth row and column. Note that a model view matrix is defined for the object model in 3D space and subsequently multiplied by the rotation matrix to complete the transformation. The model view matrix is distinct from the projection matrix, which defines the camera’s field of view, aspect ratio and near/far clipping values along the Z axis.

Here is an example of how this might be arranged with the GLKit framework, introduced in iOS 5.0:

      CMDeviceMotion *deviceMotion = motionManager.deviceMotion;
      CMAttitude *attitude = deviceMotion.attitude;
      CMRotationMatrix rotation = attitude.rotationMatrix;
      // Embed the 3x3 attitude matrix in a 4x4 matrix with an identity
      // fourth row and column:
      GLKMatrix4 rotationMatrix = GLKMatrix4Make(rotation.m11, rotation.m21, rotation.m31, 0,
                                                 rotation.m12, rotation.m22, rotation.m32, 0,
                                                 rotation.m13, rotation.m23, rotation.m33, 0,
                                                 0, 0, 0, 1);
      modelViewMatrix = GLKMatrix4Multiply(modelViewMatrix, rotationMatrix);

From this point, the modelViewMatrix can be scaled, rotated by a gravity-correction factor, or applied directly to the transform of an object model or skybox effect. These are provided by the GLKit API, included in the iOS 5.0 release to simplify OpenGL ES with usable templates and higher-level calls for common 3D modeling elements. By replacing the aggregate yaw, pitch and roll rotation operations with a full rotation matrix accounting for the Euler angles, we have resolved any further ambiguity in model view orientation, as well as the incidence of gimbal lock, when taking the azimuth in 3D space into account.

Mini Review: iTunes Match for Canada is here, and it’s good

Posted: December 16, 2011 by Ash Mishra in News, Review

Christmas arrived early for me this year, and it came in the form of iTunes Match for Canada.  There has been a lot of speculation about when the rest of the non-US world would get it, and I’m really glad it’s here.

So what is it, and why would you get it?  My iTunes music library is over 10,000 tracks, and I’ve bought into the Apple ecosystem for several years now – first because of the iPod, and now the iPhone.  What iTunes Match primarily does is take the master library from your computer and put it in the “cloud”, where you can then retrieve all your playlists, tracks, and associated personal data such as ratings and classifications.

For the longest time, users with more than one computer have had to jump through hoops to keep their iTunes libraries in sync (often relying on hacks that sometimes break).  With iTunes Match, that big problem is solved – your master music library is no longer on any computer; it’s in the cloud.

The way it works is like this:  after the initial loading of your music library into the cloud, every device (computer, iPad, or iPhone) with iTunes Match turned on gets a copy of the library.  Each device copy is kept in sync automatically with the master: make a change on one – like creating or editing a playlist – and that change appears on all the devices.  Now you may be wondering about the size of your library being transferred to a smaller device.  Not to worry: only the information about your library is synced; music tracks are downloaded from the cloud only when you want to listen to them.

Apple does one very smart thing with its cloud music service that Google Music and Amazon’s Cloud Drive don’t – providing an efficient and quick way to get your music library into the cloud.  Google Music and Amazon Cloud Drive require you to upload all your music to their services.  This can be very time-consuming if you have a large library of files, and takes up a lot of bandwidth.

The way iTunes Match sets up your library in the cloud is by examining your library and matching your audio tracks against all of Apple’s music.  If it finds a match, it doesn’t upload the track from your computer; it can just serve the track directly.  Even better, all the matched music is served in DRM-free, 256 kbps AAC quality.  Out of my library of over 10,000 tracks, iTunes Match correctly identified over 9,000.  The remaining tracks it couldn’t match were uploaded.  The result was getting my iTunes library into the cloud really fast.

Apple’s done one other smart thing with its implementation of iTunes Match on devices:  automatically managing your music library on devices that don’t have enough space.  Say you play tracks on a 16 GB iPod touch but your music library is 80 GB.  iTunes Match lets you download the tracks you want to listen to, and as you get close to running out of space, it automatically removes the music you’ve listened to least.  Don’t worry: any music it removes is still in the cloud, and simply selecting a track to play re-downloads it to your device.


The good

+ Very fast and efficient matching of tracks, even with large libraries

+ A single library in the cloud – synced across all your devices – computers and mobile devices

+ Smart space management means you never have to worry about running out of space again

+ Music that you play is delivered in very high quality AAC 256 kbps format

Could be better

– Uploading of tracks not matched is constrained by the demand on Apple’s servers and your own bandwidth.  My uploading is pretty slow.

Who’s it for?

Anyone who has more than one computer and one iOS device, or wants to upgrade their music library to a higher-quality encoding.

2011 Macbook Air for iOS Developers

Posted: July 28, 2011 by Ash Mishra in Productivity, Programming, Review
New MacBook Airs are here!

The reviews have started pouring in for the updated Core i5/i7 MacBook Airs released in July 2011, unanimously acclaiming the ultra-portable notebook.  There’s no question they are a leap in performance from the prior models, and they are now powerful enough to be considered tools for more than the casual user.  I even think they are good enough to be considered by media professionals and developers.

I decided to pick up the 13″ MacBook Air to see if it could replace, or serve as a secondary machine for, my iOS development.  I originally ordered the top i7/256 model, but returned it in favour of the more affordable i5/128, which in my experience (so far) has been more than sufficient.  I have a 2007 17″ MacBook Pro (2.4 GHz Core 2 Duo / 4 GB / 500 GB HD / 1920 × 1200) that I have been using for my primary development.  I have enjoyed using the 17″ screen for iPad development – particularly being able to use the simulator at full scale while having Xcode on the same screen.

Read the rest of this entry »

iOS Sensor Fusion

Posted: July 10, 2011 by Mike Ilich in Design, Programming

One might suggest that we’re approaching the golden age of Human Factors design with the introduction of peripheral sensory input that has been inherent in many smartphones for a number of years now. While multi-touch LCD screens have eliminated the need for a separate keyboard or pointing device, paving the way for modern tablets, screen real estate became the limiting factor with on-screen keyboards and the dreaded dual-thumbstick overlay for navigating 3D graphic environments. Built-in sensors such as accelerometers, gyroscopes and digital compasses have only recently seen extensive use as part of the overall UX design, offering physical motion and device orientation as viable UI controls.

Read the rest of this entry »

VBO (Vertex Buffer Object) Limitations on iOS Devices

Posted: July 10, 2011 by Kearwood "Kip" Gilbert in Programming

With the iPhone 3GS, the iOS platform gained programmable shader support through the PowerVR SGX 535.  Accompanied by an increase from 2 to 8 texture units, this gives artists and developers nearly limitless creative expression: they can use any material in their 3D scenes that they can describe in GLSL.  The unsung hero of these upgraded GPUs is their “real” Vertex Buffer Objects (VBOs).

Vertex Buffer Objects allow vertex-level data to be moved from slower main memory to memory that can be accessed more rapidly by the GPU.  Instead of making a separate GL call to render each polygon, the VBOs can be initialized once when a model is loaded, and a single GL call then instructs the GPU to iterate through the vertices and render the polygons itself.  Even though the new iOS platforms have a UMM (Unified Memory Model), where the CPU and GPU share the same RAM chips and bus, utilizing VBOs frees up the ARM CPU core during rendering for other game-pipeline tasks such as physics calculation, AI, and audio.
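The same initialize-once, draw-with-one-call pattern carries over to WebGL, which is based on OpenGL ES 2.0. Here is a rough JavaScript sketch of the idea – shader and vertex-attribute setup are omitted, and the function and field names are illustrative rather than from any particular engine:

```javascript
// Sketch: VBO usage — buffer the vertex data once at model-load time,
// then draw with a single call instead of one GL call per polygon.
// WebGL mirrors the OpenGL ES 2.0 API used on iOS.
function loadModel(gl, vertices, indices) {
  // Move vertex-level data into GPU-accessible buffer storage.
  var vbo = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);

  // Index buffer: tells the GPU how the vertices form triangles.
  var ibo = gl.createBuffer();
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, ibo);
  gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(indices), gl.STATIC_DRAW);

  return { vbo: vbo, ibo: ibo, count: indices.length };
}

function drawModel(gl, model) {
  // One call; the GPU iterates the buffered vertices itself,
  // leaving the CPU free for physics, AI, and audio.
  gl.bindBuffer(gl.ARRAY_BUFFER, model.vbo);
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, model.ibo);
  gl.drawElements(gl.TRIANGLES, model.count, gl.UNSIGNED_SHORT, 0);
}
```

On iOS the equivalent calls are glGenBuffers, glBindBuffer, glBufferData and glDrawElements; the structure of the load/draw split is identical.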

Read the rest of this entry »