Finally, audio-based commands and tagging that don't suck

QR Codes

I have kept an eye on QR codes for a few years now; they are a simple technology. Simple technologies win because – well, they just plain work. So many new technologies do a better job of adding complexity to solve a problem than they do of reducing it. As an example, the majority of the world still uses headphone jacks and earbuds to listen to their iPod, even though Bluetooth is a great technology that removes the need for those easily tangled cords. BUT those annoying tangled cords are still far more reliable and simpler to use than Bluetooth. So we wait for the “simpler” technology to become – simpler. QR codes have the same M.O. They are simple and they work, but they are also annoyingly primitive. For example, that QR code image you see on the left of this article means absolutely nothing to you visually. Yet I use it to take up space on the page, because it can provide value if you are willing to pull out your phone and take a snapshot of it.

The QR code works because it is a unique image that contains data within all its black and white specks, like a data fingerprint or a bar code. A device with an app that can read those specks converts the “fingerprint” into an equally unique URL that the app can then direct you to.
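That fingerprint-to-URL step can be sketched in a few lines. This is purely a conceptual illustration – the hash shortcut and the code-to-URL table below are made up, standing in for a real QR decoder and a real redirect service:

```python
import hashlib

# Hypothetical table mapping decoded payloads to destinations,
# like the redirect service that sits behind many QR campaigns.
CODE_TO_URL = {}

def decode_payload(pixel_data):
    # Stand-in for real QR decoding: derive a short, repeatable code
    # from the unique pattern of black and white modules.
    return hashlib.sha1(pixel_data).hexdigest()[:8]

def resolve(pixel_data):
    # The reader app's job: turn the image's "fingerprint" into a URL,
    # or nothing if the code is unknown.
    return CODE_TO_URL.get(decode_payload(pixel_data))
```

The point is only that the image itself carries no meaning for you; it is a key into someone else’s lookup table.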

A more complex, yet more direct, solution is to have an image that is human readable, like an ad, act as the data-rich fingerprint. That way a person can either take advantage of the precious real estate of the ad by simply reading it, or take a picture of the ad and get directed to the related URL. It looks like technologies such as Google Goggles are on their way to cracking that nut. For now, however, Google Goggles is not more reliable or simpler to use than our ugly, cryptic, yet simple and reliable QR code.

Audio Commands and Tagging

 

The use of audio commands has had a problem finding its place as a “simpler” solution to the everyday problems it always claims to have solved, but just as often falls short of expectations. I don’t know how many friends of mine have had voice-command car systems that in the end just don’t work as reliably and effectively as turning a dial or pressing a button on their dash.

John’s car: “bee-eep. Can I help you?”

John in his car: “Call Sean”

John’s car: “Looking for johns and bathrooms in the area.”

John: “Ugghhh! No, call Sean!”

Car: “Bee-eep. Thank you. Calling Don now…”

John: :-[

Siri seems to be making voice commands better, or is at least being marketed that way, but the dream of talking to our computers as the easier way to interact with them still seems as far away as it did here (see min 3:00 in 1984): http://www.youtube.com/watch?v=2B-XwPjn9YY

 

 

Okay, Siri and Google voice commands are doing better, and getting more use than I have ever seen similar technologies get in the past, so that is promising…but yelling into your phone to “search for nearby bars” in a crowded room is – well – shitty.

Shazam made some great leaps forward in the audio tagging and command space by finding the unique characteristics in songs and turning them into pertinent data. That uniqueness is used so the app can determine a song’s name, and its singer, just by holding up your phone to a song you hear on the radio.

This year, as some of you may have already seen, Shazam has gotten into the QR-related space by bringing their technology to TV commercials. When you see the Shazam logo on a commercial, open your Shazam app and let Shazam listen to the commercial’s unique audio. Their ability to take the unique “fingerprint” of sound coming from the commercial’s audio and turn it into useful data allows them to have commercials open a website on your device, much like a QR code. It is neat because the audio is as easily interpreted by human ears as it is by the Shazam app, maximizing the use of the allotted ad space. Unfortunately it falls short in that it is impractical to expect a viewer to chase down their phone, open the Shazam app, and tag the commercial’s audio before the commercial is over.

Audio Sync

 

I think I just saw a technology that actually makes sense. Practical in its use, efficient in its implementation, and it solves a problem by decreasing complexity more than it adds.

In this case the audio is used to sync your tablet to a show you are watching. With this strategy you are truly decreasing the steps needed to get what the show, and the viewer, wants. No extra steps, no rushing for an unrelated app to open a web page, no ugly QR code images taking up space – just a way to help users link the app they are using to the show they are watching.

It works by listening to the show you are watching and applying the Shazam-like technology to the show’s audio to recognize what part of what show you are watching. The data is processed, and instead of just opening a web page, it syncs your application’s experience to metadata surrounding the show on your TV. You can then interact with others watching the same moment at the same time, or listen to back stories related to the segment. Cool beans.
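The flow above can be sketched as a toy fingerprint-and-lookup. Everything here is made up for illustration – the fingerprint function and the catalog stand in for the real (far more sophisticated) acoustic matching:

```python
# Toy sketch of audio sync: fingerprint a short snippet of TV audio,
# match it against a catalog of (show, offset) fingerprints, then
# serve the metadata for that exact moment in the show.

def fingerprint(samples):
    # Stand-in for real acoustic fingerprinting (e.g. spectral peaks):
    # reduce a window of samples to a compact, repeatable signature.
    return tuple(round(s, 1) for s in samples[::4])

CATALOG = {}  # hypothetical: signature -> (show, seconds into it)

def index_show(show, audio, window=8):
    # Pre-index every window of the show's audio track.
    for offset in range(0, len(audio) - window + 1, window):
        CATALOG[fingerprint(audio[offset:offset + window])] = (show, offset)

def sync(snippet, metadata_by_offset):
    # Identify the moment the viewer is watching; return its metadata.
    match = CATALOG.get(fingerprint(snippet))
    if match is None:
        return None
    return metadata_by_offset.get(match)
```

The key design point is that the match returns not just *which* show, but *where* in the show you are, so the app can stay in step with the broadcast.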

 

 

 

Nanoseconds for dummies

One of our talented engineers, Aseem, sent this out over email to the group this morning. I really enjoyed it for a few reasons, and figured I would share it as well. First, it is a lecture from the inventor of the compiler; second, it is a lecture from someone in the military; third, she (Grace Hopper) is very old and I find that inspirational and cute (as offensive as that feeling of mine may be to others – it’s true); lastly, and most importantly, it gives a great visual example about space, time, speed, and bandwidth.

Check-id-ouuuut…..
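Her most famous prop from that lecture – the piece of wire whose length represents a nanosecond – is easy to recompute yourself:

```python
# The visual aid from the lecture: how far does light (the upper bound
# for any signal) travel in one nanosecond?
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def light_distance_m(seconds):
    # Distance light covers in a given time, in meters.
    return SPEED_OF_LIGHT_M_PER_S * seconds

nanosecond_wire = light_distance_m(1e-9)   # ~0.3 m, about a foot of wire
microsecond_coil = light_distance_m(1e-6)  # ~300 m of wire
```

That foot-long wire is why, past a certain point, you can’t make computers faster without making them smaller.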

Use Case: Searching for PMF

Scan.me is a great use case for focusing on the right customer, not just the right product.

AND that the product doesn’t *have to* be complicated or new to be wanted.

It just has to be easier to use, and packaged up better than the alternative.

http://techcrunch.com/2012/02/23/scan-gets-1-7m-from-google-ventures-and-shervin-pishevar-to-make-qr-codes-actually-useful/

 

Apple stores put their money where their mouth is

Ahhh the checkout line.

I know what I want, I found what I want in your store, AND I want to pay for it. So what do you do? You make me wait in line to give you my money! Man, that system is so archaic. Talk of “self checkout” has been around for a while, but I have seen very few instances of it in practice. Aside from the self checkout in the grocery store, which is still just a line in the end, checking out while picking up items in the store is not a part of our daily lives – yet.

So with all the rumors of our devices one day helping us check out in our favorite stores, Apple finally made the decision to take the first step and offer self checkout on your iOS device at all Apple stores. Just download the newest version of the Apple Store app and buy ’til your heart’s content, or your bank account runs dry, whichever comes first.

I have yet to use it myself, but am anxious to see how the company that is known for defining best practices around new concepts will implement its self checkout. I am also curious to see how they handle a jam-packed store with hundreds of very valuable items, mostly in the $1K+ range.

 

        

7-7 and the cure for hiccups. You will thank me later.

There is sort of a funny story around this… My friend recently had hiccups that just wouldn’t go away. The funny thing is, this moment shot me back 20 years, reminding me of when I was in 6th grade and thought I had invented the cure for hiccups. I distinctly remember that my cure became popular when a kid who had hiccups in my school would be told by everyone to come talk to me, because I was the man, or more like the kid, who could help get rid of them.

Fast forward a couple decades, and I am sitting in this situation with my friend. To be honest, I was actually too embarrassed to tell her “I have the cure; try this!” I thought it would sound dorky and cliché. So, doubting the legitimacy of my 6th grade nostalgic memory, I instead turned to the internet.

I found tons of hits on Google, and my friend and I proceeded to try each one. From plugging her ears with her fingers, to holding her breath, to rubbing her throat with my fingers, we tried them all, and every time, after a few seconds of silence <hiccup!> – they were back.

So screw it, I said, and I told her the story from my childhood, recalling my days as the de facto hiccup shaman of my 6th grade class. I swear to you, after she followed the exercise, the hiccups never returned.

With my newfound blogger mentality, I went to my computer and decided to share my cure with the world. Here it is. Although simple, it is effective, and the combination of, and adherence to, the formula is a necessity. Like the line “it was the best of times, it was the worst of times,” any variation, no matter how similar, would do it an injustice.

Without further ado, here it is – the 7-7 cure for hiccups :

  1. Fill up a glass of water.
  2. Breathe in until you can’t breathe in anymore.
  3. While your lungs are full, pretend as if you were forcing air out of your lungs, but prevent yourself from doing so, and hold that for 7 seconds. If done right, your abs will feel tight and a slight pressure will form in your throat.
  4. (If you hiccup in the middle of this, or at any time, start over. It is necessary that you do not hiccup while practicing these steps. Tell your mind, “if I hiccup, I will do it the second I am done, but definitely not during the exercise.”)
  5. Now, while your lungs are full and after you have completed counting the seven seconds, take seven large gulps of water. While gulping the water, feel the gulps massage your throat. This may cause a burp, but again, if it stops you from doing the exercise, you must start over.
  6. That’s it. Once completed, wait and see if it works.

 

To recap:

Fill your lungs with air, hold it for seven seconds, and then take seven gulps of water.

Steve Jobs in the beginning

Here are a couple videos that show Steve Jobs growing his philosophy, company vision, and product. One comes with a nice narration around his time building NeXT Computer. It’s a great glimpse into the fundamental beliefs that guided him through the years. The other is him giving a lecture on where Apple came from and where it is going.

 

This video narrates Jobs creating NeXT Computer with the first group of employees at their company retreats.

This video is an early presentation by Jobs from 1980 describing how Apple started, what he sees its effects being, and some interesting insight into what he sees in tech startup potential and the growth of the human race.

3 iOS Tricks That Are Amazingly Unknown

It sucks not being able to search for a specific word on a webpage on your iPhone or iPad.

I – Find on page

iPhone & iPads

  1. Tap on the Google search box.
  2. Enter the word you want to search for.
  3. At the bottom of the web search results, a section lists whether the word appears on the current web page you’re on.
  4. Scroll down to reveal the words found on the page.

Just iPad

The iPad recently added a new “search web page” field to the keyboard when searching in the Google input box.

Have you used four fingers on your iPad lately?

iPad only.

II – Swipe between apps

When you have more than one app running in the background, place four fingers on your screen and swipe right or left to alternate between background apps, without needing to double-click the home button or open and close apps.

III – Open app in background with swipe

You can access your background apps without double-clicking your home button on the iPad.

Simply swipe upwards with four fingers on your screen and you will get a listing of all your background apps.

Hope you have fun rediscovering the fun you can have with your iOS devices!

Google TV is finally a Google TV

Got home tonight and noticed my Google TV wasn’t responding to my Harmony iPhone remote… After some fiddling around, a simple update ended up being all that was required. To my surprise it wasn’t just a minor release, like it usually ends up being. My Google TV has changed considerably (along with my iPhone remote), and my Google TV finally has the Android Market – ergo, finally a real Google TV! Yay!

I downloaded some apps and checked out the new user interface and workflow. I now have a 50-inch non-touch tablet 🙂

There is also the “allow unknown sources” option in the settings, along with enable debugging, so I guess developing for my TV is now piled onto my list of things to do.

 

After some googling, I found some info on the update. You can check it out here: http://www.google.com/tv/