Cheap Gas! Version 3.0 Now in iTunes! Augmented Reality is HERE!

Augmented Reality in Upcoming Version of Cheap Gas! – “Cyborg”

Amigo 1.1 – Now in iTunes!

The next revision of Amigo (1.1) is now in iTunes, here.

This version has been tested with OS 3.0 and can now upload photos to FriendFeed.

Check it out!

Some Upcoming Changes for Amigo

A few of the changes coming in the next revision of Amigo.

NSDateFormatter and Twitter

The NSDateFormatter class is super handy for scanning incoming text dates and turning them into NSDate objects, which you can then use for other date manipulation.

As handy as it is, it is not as well documented as one would hope.

As a bit of a teaser, I’ll list a version of an NSDateFormatter helper function that will take dates from the Twitter API and convert them into NSDate objects. I will flesh out the details behind the function in a post to come Thursday or Friday.

NSDateFormatter Helper:

-(NSDate*)dateFromTwitter:(NSString*)str {
    static NSDateFormatter* sTwitter = nil;

    // No string to parse? Just hand back "now".
    if (str == nil) {
        NSDate * today = [[[NSDate alloc] init] autorelease];
        return today;
    }

    // Build the formatter once and reuse it - creating one is expensive.
    if (!sTwitter) {
        sTwitter = [[NSDateFormatter alloc] init];
        [sTwitter setTimeStyle:NSDateFormatterFullStyle];
        [sTwitter setFormatterBehavior:NSDateFormatterBehavior10_4];
        // Pin the locale so Twitter's English day/month names parse on any device.
        [sTwitter setLocale:[[[NSLocale alloc] initWithLocaleIdentifier:@"en_US_POSIX"] autorelease]];
        // Twitter dates look like: Tue Jun 09 18:12:55 +0000 2009
        [sTwitter setDateFormat:@"EEE LLL dd HH:mm:ss Z yyyy"];
    }
    return [sTwitter dateFromString:str];
}
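
As a quick, purely hypothetical example of calling the helper (the date string below just mimics the shape of Twitter’s created_at field):

// Hypothetical usage - the string stands in for whatever the Twitter API returned.
NSString * createdAt = @"Tue Jun 09 18:12:55 +0000 2009";
NSDate * tweetDate = [self dateFromTwitter:createdAt];
NSLog(@"Parsed tweet date: %@", tweetDate);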

Geo-Tagging Pics on the iPhone: Easy AND Hard

Well, over the weekend, I made some progress with my woes of getting geo-tagged info from pictures on the iPhone.

I wound up writing my own analogue of the camera roll picker, and it came out really nice – I was able to read pics from the phone’s DCIM directories for both pictures and screen grabs.

But more importantly, I was able to read the EXIF information from the photos, which the API picker strips from view.

My happiness turned out to be short-lived, though – any image I took with the image picker controller’s camera setting also had its EXIF information stripped when saved to disk.

After much pain (which I’ll detail in a future post), my solution was the following:

1) Write the picture taken with the camera to disk using the Apple API. This allows the API to figure out what the next picture should be named (not as easy as just looking for the highest-numbered file – for example, what if there are no pics on the phone? You would think the name would be IMG_0001.JPG, but the iPhone “knows” – somehow – what the last picture taken actually was). By calling the API, you always get the phone to tell you what the next file name should be.

2) Now that you have written a JPEG using the API (and really, a 75×75 thumbnail JPEG as well), delete that image.

3) Write your version of that image, with the appropriate geo tags from the CoreLocation services.

That is the 10,000 foot version of things. In reality, this little feat of mimicking the built in camera controllers turned out to be a royal pain in the ass.
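
To make those three steps a little more concrete, here is a rough sketch of the flow. UIImageWriteToSavedPhotosAlbum is the real Apple call, but kDCIMPath and the writeJPEGWithGeoTags:location:toPath: helper are my own placeholder names, and real code would wait for the save-completion callback before touching the directory:

// Sketch of the save / delete / re-write dance. The DCIM path and the
// writeJPEGWithGeoTags:location:toPath: helper are placeholders, not Apple API.
#define kDCIMPath @"/var/mobile/Media/DCIM/100APPLE"

-(void)saveImage:(UIImage*)image withLocation:(CLLocation*)location {
    // 1) Let the Apple API write the picture so the phone picks the next IMG_xxxx.JPG name.
    //    (In real code, wait for the completion callback before scanning the directory.)
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);

    // Find the file the API just created - the most recently modified JPG in the DCIM folder.
    NSFileManager * fm = [NSFileManager defaultManager];
    NSString * newestPath = nil;
    NSDate * newestDate = nil;
    for (NSString * file in [fm contentsOfDirectoryAtPath:kDCIMPath error:NULL]) {
        if (![[file pathExtension] isEqualToString:@"JPG"]) continue;
        NSString * fullPath = [kDCIMPath stringByAppendingPathComponent:file];
        NSDate * modDate = [[fm attributesOfItemAtPath:fullPath error:NULL] fileModificationDate];
        if (newestDate == nil || [modDate compare:newestDate] == NSOrderedDescending) {
            newestDate = modDate;
            newestPath = fullPath;
        }
    }
    if (newestPath == nil) return;

    // 2) Delete the API's copy (its 75x75 thumbnail would need the same treatment).
    [fm removeItemAtPath:newestPath error:NULL];

    // 3) Write my own JPEG to that same path, with GPS EXIF tags built from CoreLocation.
    [self writeJPEGWithGeoTags:image location:location toPath:newestPath];
}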

But at the end of the day, I can take pics, store geo tags, and read geo tags from the iPhone in a way that looks just like the documented UIImagePickerController interfaces do.

You Can Have It In Any Color – As Long As It’s Black

I’ve been banging my head up against a wall today over something that should be easy.

What was I trying to do?

Well, I’ve been working this week on a new app for TweetPhoto.com that will allow you to post photos from your iPhone to the TweetPhoto web service.  In fact I wrote about it here.

I thought I had one problem licked – sending geo tag information to the API call.  As it turns out, I had only HALF the problem licked.

Initially, I was using the phone’s CoreLocation services to determine location and upload that to the API.  This is totally cool when you are using the camera.
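
For the camera case, the CoreLocation side is roughly this – a minimal sketch, where locationManager, currentLatitude and currentLongitude are assumed ivars of mine and error handling is left out:

// Minimal sketch: grab a fix with CoreLocation while the camera is up.
// locationManager, currentLatitude and currentLongitude are assumed ivars.
-(void)startLocationFix {
    locationManager = [[CLLocationManager alloc] init];
    locationManager.delegate = self;
    locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    [locationManager startUpdatingLocation];
}

-(void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation {
    // Remember the coordinates so they can ride along with the upload API call.
    currentLatitude  = newLocation.coordinate.latitude;
    currentLongitude = newLocation.coordinate.longitude;
    [manager stopUpdatingLocation];
}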

Not so much when uploading photos taken elsewhere.  Ouch.

OK.  No problem.  I know the phone stores EXIF (exchangeable image file format) information with the JPEGs it takes, so it should be a slam dunk to grab that from the image picker control, right?

Wrong, grasshopper (with props to the recently deceased David Carradine).

The image picker STRIPS all EXIF information from photos passed in from the camera roll, so you, Mr. and / or Mrs. Developer, are screwed.

OK.  I’m a fart smeller.  I should be able to figure this out.

Hey – what if I can find where the camera roll is on the iPhone, enumerate it directly and read the image files from there?

Oh.  Heavenly Days.  That is a GREAT idea.

So, I trot out this gem, feeling like I have the problem close to being solved:

-(void)getCoords:(UIImage *)image lat:(float*)latAddr lon:(float*)lonAddr {

    // Walk the camera roll directory directly (this is the phone's DCIM folder,
    // outside the app sandbox).
    NSDirectoryEnumerator *enumerator = [[NSFileManager defaultManager] enumeratorAtPath:@"/var/mobile/Media/DCIM/100APPLE"];
    NSAutoreleasePool *innerPool = [[NSAutoreleasePool alloc] init];
    id curObject;

    *latAddr = 0;
    *lonAddr = 0;

    while ((curObject = [enumerator nextObject])) {
        if ([[curObject pathExtension] isEqualToString:@"JPG"]) {

            NSData * fileContents = [NSData dataWithContentsOfFile:[NSString stringWithFormat:@"/var/mobile/Media/DCIM/100APPLE/%@", curObject]];
            UIImageView * seeMe = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
            seeMe.image = [UIImage imageWithData:fileContents];

            // Scan the raw JPEG data for its EXIF metadata, then pull out the GPS tags.
            EXFJpeg* jpegScanner = [[EXFJpeg alloc] init];
            [jpegScanner scanImageData:fileContents];
            EXFGPSLoc * lat   = [jpegScanner.exifMetaData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitude]];
            NSString * latRef = [jpegScanner.exifMetaData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitudeRef]];
            EXFGPSLoc * lon   = [jpegScanner.exifMetaData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitude]];
            NSString * lonRef = [jpegScanner.exifMetaData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitudeRef]];

            // Convert degrees + minutes into a decimal value (seconds are ignored here).
            float flat = lat.degrees.numerator + ((float)lat.minutes.numerator / (float)lat.minutes.denominator) / 60.0;
            float flon = lon.degrees.numerator + ((float)lon.minutes.numerator / (float)lon.minutes.denominator) / 60.0;

            // South and west are negative.
            if ([[latRef substringToIndex:1] isEqualToString:@"S"]) {
                flat = -flat;
            }
            if ([[lonRef substringToIndex:1] isEqualToString:@"W"]) {
                flon = -flon;
            }

            // Does the image match??? (This pointer comparison is the weak link - see below.)
            if (seeMe == image) {
                *latAddr = flat;
                *lonAddr = flon;
                [jpegScanner release];
                [seeMe release];
                [innerPool release];
                return;
            }

            [jpegScanner release];
            [seeMe release];
        }

        // Drain the pool each pass so the autoreleased file data doesn't pile up.
        [innerPool release];
        innerPool = [[NSAutoreleasePool alloc] init];
    }
    [innerPool release];
    innerPool = nil;
}

NOW I’m rolling.  This works GREAT.  I can read images, extract EXIF information… feeling good.

Until I realize that I have NO way to associate the files that I am reading directly with what the image picker returns to me.  Did I just say “shit” out loud?  Because that is what I’m swimming in.

The image picker simply hands back a UIImage with some editing information.  That’s it.
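
For reference, this is roughly all the picker delegate of that era gives you – a UIImage and an editing dictionary, no file name and no EXIF (the logging here is just mine, for illustration):

// The picker delegate callback: a UIImage plus editing info, and nothing more.
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo {
    NSLog(@"Picked image %@ with editing info %@ - no path, no EXIF.", image, editingInfo);
    [picker dismissModalViewControllerAnimated:YES];
}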

OK, OK, OK.  Maybe I can compare NSData elements… or UIImage elements against what I read from disk and what the picker sends back… so far, neither of those approaches is working.

And now I’m sitting here, realizing that I can accurately describe ANY image’s geo tagging information.  I just can’t pick a SPECIFIC image from the bunch, well, at least using Apple’s APIs.

I’m coming to the sad realization that I might have to write my own bastardized version of the image picker, at least for scrolling through the camera roll.

I ain’t skeered to do it.  I just wish I didn’t have to.

But that looks like that’s exactly what I’m gonna have to do.  Damn it.

Posting Photos using Objective-C

Or more specifically, posting photos to tweetPhoto.com… but the method works for posting any image from an iPhone to any site that can accept POSTed images.

You can find the API documentation for tweetPhoto here.

Since NSURLConnection and NSMutableURLRequest are covered in numerous places elsewhere, I’m not going to describe much about how to make asynchronous HTTP calls.  The code below pretty much speaks for itself.

The interesting parts of this function (sendPhoto:tweet:photo:tags:longitude:latitude:) are the portions that compose the HTTP multipart form data going over the wire.  Again, the code is fairly self-explanatory.

The inbound (NSData*)photo should be the binary image of the picture you will be sending to tweetPhoto (or whatever service you’re posting images to).  Please note that you will need to change the Content-Type to match whatever images you’ll be sending (in my case, I always send .PNG images).

So, without further ado, here’s the deal-ee-oh:

-(void)sendPhoto:(NSString*)message tweet:(BOOL)tweet photo:(NSData*)photo tags:(NSString*)tags longitude:(float)longitude latitude:(float)latitude {
    TweetPhotoAppDelegate* myApp = (TweetPhotoAppDelegate*)[[UIApplication sharedApplication] delegate];

    // Pick the endpoint: upload-and-tweet, or upload only.
    NSString *url;
    if (tweet) {
        url = @"http://www.tweetphoto.com/uploadandpostapiwithkey.php";
    } else {
        url = @"http://www.tweetphoto.com/uploadapiwithkey.php";
    }

    NSString * boundary = @"tweetPhotoBoundaryParm";
    NSMutableData *postData = [NSMutableData dataWithCapacity:[photo length] + 1024];

    // Each form field is a Content-Disposition header, a blank line, then the value.
    NSString * userNameString = [NSString stringWithFormat:@"Content-Disposition: form-data; name=\"username\"\r\n\r\n%@", myApp.loginString];
    NSString * passwordString = [NSString stringWithFormat:@"Content-Disposition: form-data; name=\"password\"\r\n\r\n%@", myApp.passwordString];
    NSString * apiString = [NSString stringWithFormat:@"Content-Disposition: form-data; name=\"api_key\"\r\n\r\n%@", apiKey];
    NSString * messageString = [NSString stringWithFormat:@"Content-Disposition: form-data; name=\"message\"\r\n\r\n%@", message];
    NSString * tagsString = [NSString stringWithFormat:@"Content-Disposition: form-data; name=\"tags\"\r\n\r\n%@", tags];
    NSString * latString = [NSString stringWithFormat:@"Content-Disposition: form-data; name=\"latitude\"\r\n\r\n%f", latitude];
    NSString * longString = [NSString stringWithFormat:@"Content-Disposition: form-data; name=\"longitude\"\r\n\r\n%f", longitude];
    NSString * boundaryString = [NSString stringWithFormat:@"\r\n--%@\r\n", boundary];
    NSString * boundaryStringFinal = [NSString stringWithFormat:@"\r\n--%@--\r\n", boundary];

    [postData appendData:[boundaryString dataUsingEncoding:NSUTF8StringEncoding]];
    [postData appendData:[userNameString dataUsingEncoding:NSUTF8StringEncoding]];
    [postData appendData:[boundaryString dataUsingEncoding:NSUTF8StringEncoding]];
    [postData appendData:[passwordString dataUsingEncoding:NSUTF8StringEncoding]];
    [postData appendData:[boundaryString dataUsingEncoding:NSUTF8StringEncoding]];
    [postData appendData:[apiString dataUsingEncoding:NSUTF8StringEncoding]];
    [postData appendData:[boundaryString dataUsingEncoding:NSUTF8StringEncoding]];

    // Optional fields only go in if they were supplied.
    if (message != nil && ![message isEqualToString:@""]) {
        [postData appendData:[messageString dataUsingEncoding:NSUTF8StringEncoding]];
        [postData appendData:[boundaryString dataUsingEncoding:NSUTF8StringEncoding]];
    }

    if (tags != nil && ![tags isEqualToString:@""]) {
        [postData appendData:[tagsString dataUsingEncoding:NSUTF8StringEncoding]];
        [postData appendData:[boundaryString dataUsingEncoding:NSUTF8StringEncoding]];
    }

    if (longitude && latitude) {
        [postData appendData:[latString dataUsingEncoding:NSUTF8StringEncoding]];
        [postData appendData:[boundaryString dataUsingEncoding:NSUTF8StringEncoding]];
        [postData appendData:[longString dataUsingEncoding:NSUTF8StringEncoding]];
        [postData appendData:[boundaryString dataUsingEncoding:NSUTF8StringEncoding]];
    }

    // The photo itself: change the filename / Content-Type if you send something other than PNG.
    [postData appendData:[@"Content-Disposition: form-data; name=\"media\"; filename=\"media.png\"\r\nContent-Type: image/png\r\n\r\n" dataUsingEncoding:NSUTF8StringEncoding]];
    [postData appendData:photo];
    [postData appendData:[boundaryStringFinal dataUsingEncoding:NSUTF8StringEncoding]];

    NSMutableURLRequest * theRequest = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:url] cachePolicy:NSURLRequestUseProtocolCachePolicy timeoutInterval:60.0];

    [theRequest setHTTPMethod:@"POST"];

    [theRequest addValue:[NSString stringWithFormat:@"multipart/form-data; boundary=%@", boundary] forHTTPHeaderField:@"Content-Type"];
    [theRequest addValue:@"www.tweetphoto.com" forHTTPHeaderField:@"Host"];
    NSString * dataLength = [NSString stringWithFormat:@"%d", [postData length]];
    [theRequest addValue:dataLength forHTTPHeaderField:@"Content-Length"];
    [theRequest setHTTPBody:postData];

    NSURLConnection * theConnection = [[NSURLConnection alloc] initWithRequest:theRequest delegate:self];

    [UIApplication sharedApplication].networkActivityIndicatorVisible = YES;
    if (theConnection) {
        receivedData = [[NSMutableData data] retain];
    }
    else {
        [myApp addTextToLog:@"Could not connect to the network" withCaption:@"tweetPhoto"];
    }
}
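
A call site might look something like this – the message, tags, and coordinates below are just placeholder values, and the image would come from the picker while the coordinates would come from CoreLocation:

// Hypothetical call - placeholder values throughout.
NSData * pngData = UIImagePNGRepresentation(selectedImage);
[self sendPhoto:@"Lunch by the bay"
          tweet:YES
          photo:pngData
           tags:@"food,lunch"
      longitude:-122.4167f
       latitude:37.7833f];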

Amigo FriendFeed Client iPhone Application – Still Cookin’, But Getting Close

Work in Progress – Amigo
