I have a glorious, big, high-definition TV. Tilt-shift is a cool technique that makes great big cities look like train sets. But I’m already tired of seeing it. “Sherlock” used it to good effect, with wide panoramas of London that made it look like a toy set (which was extremely appropriate, since Sherlock Holmes treats the city as his own personal play-set; clever). But “Continuum” blurs out parts of the picture, making me wonder, “Why do I have this big, beautiful TV?” Please stop undermining my purchase decision. On the other hand, it’s fun to see a police officer from Portland, OR, as the “alien” outsider (even though she’s really from the future and far more of an alien than the local Vancouver police realize).

Really big solar farms

I saw this article about a “huge” solar farm in Arizona. I’ve been reading a lot of physics non-fiction and sci-fi lately, so my definition of “huge” has been expanded. All of Arizona isn’t exactly “huge” to me right now, and this solar farm would just be a glint if seen from orbit.

Just a thought, but at a certain scale, the mirrors focusing energy can’t simply be placed on the surface of the earth and pointed at a tower or other collector. At “enormous” scale, because of the curvature of the earth, the mirrors would actually have to point at something below the local horizon, under the curve of the surface, or the light would have to be re-reflected along the surface (the way microwave relay towers work today). That would really be something to see.
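
A back-of-the-envelope check (my numbers, not the article’s): over a distance d along the surface, the earth curves away from the horizontal by roughly

```latex
h \approx \frac{d^2}{2R}
  = \frac{(10\,\mathrm{km})^2}{2 \times 6371\,\mathrm{km}}
  \approx 7.8\,\mathrm{m}
```

where R ≈ 6371 km is the earth’s radius. So a mirror just 10 km from its tower is already aiming almost eight meters below its own horizontal, and the effect grows with the square of the distance.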

California or bust

Wow. I haven’t posted in a long time. But this is big news: I’m moving with my fiancée to California in two weeks. Also, I’ll be homeless, but I’m confident that I can find a livable place within a few weeks. Picking a place without your significant other with you can be… challenging. I’ll try to be more prolific after the move. I’ll be working with the Google Plus Platform team, helping G+ spread to the rest of the web.

I’m excited about the opportunity, but it’s bittersweet. I’ve grown accustomed to New York City (although I’m not “in love” with it, like many people who move here). I’ll miss the people I’ve grown closer to, and I’ll also miss those who I’ve grown apart from. If that sounds like a veiled statement, you’re completely right. And if you think that statement is about you, you’re probably right too. Or “you’re so vain.” Either way.

Programming With Cursors is Evil

Cursors seem like a great idea. A cursor, for my non-programming friends, is like running your finger through a book as you’re reading it. It helps you keep your place and navigate a set of data. Imagine you’re cooking from a recipe. It’s really nice to keep your place in the recipe while you add some eggs to the mixer, for example. It makes it easy to walk step by step through whatever you’re reading. Of course, most recipes consist of two sets of data to navigate: a list of ingredients and the set of steps required to combine those ingredients into your delicious treat. Have you ever gotten to the end of a recipe and discovered you skipped an ingredient? That’s only one way cursors can mess with your lovingly crafted program.

Cursors become truly evil the same way so many programming metaphors become evil: by allowing you to write where the cursor is. Let’s reverse the metaphor for a second. Suppose you have a recipe that you’re changing based on how the cook actually performs the steps. Using your finger, you move step by step as the cook performs the existing steps in the recipe. When the cook does something different from the step you’re expecting, you make some room and insert the additional things he does. Sounds as easy as using a word processor, doesn’t it?

The problem is that few computer programs are so simple. Suppose now the cook just starts doing the steps in a different order. It’s now a much more complex task to make sure you’re tracking the steps in the recipe. Now think of a computer program doing something similar. Even a reasonably good programmer makes mistakes. They’re called bugs, and every single piece of software I have ever encountered has them. In the real world, let’s say you get distracted while the cook does something differently and you don’t notice. It doesn’t matter for this analogy whether you missed a step you expected or missed a new one. Now you’re out of sync. There are only two possible outcomes: you get back in sync (in the real world you can talk it out with the cook) or you experience a hopeless cascade of failures. “Did you ever add the cinnamon?”
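
For the programmers still with me, here’s what that desync looks like in practice. This is a minimal Java sketch (my own illustration, not any specific program I debugged): Java’s Iterator is a cursor over a list, and ArrayList detects exactly this kind of divergence by blowing up.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class OutOfSync {
    public static void main(String[] args) {
        List<String> steps = new ArrayList<>(List.of("crack eggs", "add flour", "bake"));
        Iterator<String> cursor = steps.iterator(); // the "finger" on the recipe

        while (cursor.hasNext()) {
            String step = cursor.next();
            if (step.equals("add flour")) {
                // The program changes the data behind the cursor's back...
                steps.add("add cinnamon");
            }
            // ...and the very next cursor access throws
            // java.util.ConcurrentModificationException.
        }
    }
}
```

Java happens to fail fast here, which is the friendly outcome. Cursors over files, databases, or network streams usually have no such tripwire; they silently drift, and you get the hopeless cascade of failures instead.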

The evil of cursors shows itself when you try to correct the problem, which requires discovering the one step that pushed you out of sync. The program that uses the cursor relies on the cursor to hold the state of the data, but the cursor relies on the rest of the program (the logic) always agreeing with that state. Because there is no explicit connection between the cursor and the program, if they start to diverge in their perception of reality, it can be extremely difficult to discover where the divergence starts (which, by the way, is the only way to fix a bug like this). A complex recipe has on the order of 50 steps. A computer program has millions. It’s not hard to see the potential impact. So the next time you use a cursor (or anything like it), keep these dangers in mind and program defensively against them. Here are some techniques I used:

  • Enable debugging (logging, really) of your cursor state. Whenever you change, move, insert, or delete, make sure there’s a way to verify that the cursor is at least in a valid state. Don’t do the debugging in production, though; better yet, make a system property or environment variable control whether it’s enabled. If you can track operations on your cursor, consider keeping an expected state handy in debug mode. Enable it at all times inside your IDE. This will help you recognize a problem during your build-test cycle (see the sketch after this list).
  • Make sure each block of code that manipulates the cursor or the underlying data is consistent within itself. Unit testing is a great way to accomplish this in an automated fashion. I’m a fan of unit testing, especially when it can save literally days of debugging.
  • If you end up encountering a synchronization issue, make creative use of your debugger. Conditional breakpoints, evaluating expressions during debug time, and (if your debugger has it) historical playback can be extremely valuable. I set a breakpoint when a known synchronization error occurred. In my case, the error only presented itself after a write was made, so this step alone told me only that the error happened before I hit the breakpoint. It sure was a good start, though. Don’t be afraid to have complex conditionals, either. You’re debugging, so use any horsepower you need. It beats stepping through the whole program line-by-line.
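
As a sketch of the first technique, here’s roughly the shape it took, in Java (the class name and the invariant are simplified for illustration). The trick is gating validation behind a system property, so the checks cost nothing in production but are always on inside the IDE:

```java
import java.util.List;

/** A toy cursor over recipe steps, with optional state validation. */
public class RecipeCursor {
    // Enabled with -DcursorDebug=true; off by default, so production pays nothing.
    private static final boolean DEBUG = Boolean.getBoolean("cursorDebug");

    private final List<String> steps;
    private int position = 0;

    public RecipeCursor(List<String> steps) {
        this.steps = steps;
    }

    public boolean hasNext() {
        return position < steps.size();
    }

    public String next() {
        checkState();
        return steps.get(position++);
    }

    public void insert(String step) {
        checkState();
        steps.add(position, step); // write at the cursor, like a word processor
        checkState();
    }

    // Validate the invariant around every operation when debugging is enabled.
    private void checkState() {
        if (DEBUG && (position < 0 || position > steps.size())) {
            throw new IllegalStateException(
                "Cursor out of sync: position=" + position + ", size=" + steps.size());
        }
    }
}
```

Add -DcursorDebug=true to your IDE’s run configuration and the cursor screams the moment it enters an invalid state, instead of days later when the cinnamon goes missing.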

Profit and Purpose

Here is a very interesting video analyzing the use of incentives for tasks that require “more than rudimentary cognitive skill”. The net result is that more pay doesn’t drive better performance, which is unexpected.

When watching this video, something occurred to me. It refers to three primary factors that drive better performance (and personal satisfaction) in these cognitive tasks: Autonomy, (Pursuit of) Mastery, and Purpose. This seems pretty obvious to me, now that I’ve heard someone say it. So if pay isn’t for driving better performance, what is it for?

In the case of the cognitive, self-driven individual, I suspect pay is actually compensation for the surrender of one or more of these factors. Consider someone who would “do the job anyway, even for no pay”. Those are the kinds of jobs I’ve always enjoyed having… but I’ve also enjoyed getting paid for them, so I’ve tried to say things like that out of earshot of my employer! Put simply, workers trade away some autonomy (working on the projects they choose), mastery (specializing in exactly the arcane corner of the subject that fascinates them most), and/or purpose (driving toward their own goal rather than the employer’s), all for a paycheck. So someone with a high degree of one factor (say, Mastery) might be convinced, for a large amount of money, to apply that mastery to a different goal, to varying degrees; for example, they may be able to devote only a smaller portion of their time to their personal purpose.

All this makes me ask, within this framework, “what is it to be an entrepreneur?” To be an entrepreneur is to have a high amount of autonomy, mastery, and purpose. You set all these to “high” and hope to incentivize others to give up some of their factors in order to help you achieve your goal. And in exchange, you typically defer any big money (i.e. salary) you might get until someone wants to take over one or more of these factors, either through acquisition or public offering.

It’s an interesting model, and one that I’m sure I will be using from now on.

Android FAILs

It seems ironic to me, having been in the software industry for so long, that the phone based on the open-source operating system has such Microsoft-feeling ads (the two mentioned here are at the bottom of this post). The latest bomb to drop is the “Stealth” Android commercial. I think they were shooting for the Lexus vibe, but I really think these spots are missing the mark (see 0:32 in the Droid spot, where they literally drop the phones into the ocean). But the spot that really compelled me to write this post was the highly stylish Sony Ericsson Xperia ad. This ad drops so many name brands that for a second, I thought Chanel had come out with a new phone.

I fell under the Sony Ericsson spell a few years ago and bought a beautiful phone with a big, bright screen (unusual at the time). Although I enjoyed the phone and browser parts, I found the media parts completely useless. Without illegally downloading content, there was no way to get anything useful onto the phone. I found this really frustrating, because the software was actually kind of decent.

Speaking of software, mobile applications will undoubtedly be ported to every piece of hardware that can support them, and if the mobile gaming industry ends up driving the application content, it has tons of experience porting to different platforms. Although perhaps it would be nice if Android could showcase even one application. Of course, which hardware builder and/or mobile network is going to showcase an application that isn’t exclusive to them? I’m not sure.

My real problem with Android phones is that there is no clear solution for distribution of content. I’ve found that audio/video compatibility becomes an unbearable burden with anyone but Apple. Or maybe Microsoft, since at least then I know I’ll be locked into Windows Media. Even then, I wouldn’t know how to legitimately get content onto a Windows Mobile phone. It’s pretty obvious with an iPhone.

Full disclosure: I’m a happy iPhone user. Ever since my first experience receiving a call while my iPhone was playing music through my car stereo, when the music faded out like a Hollywood movie, I’ve been in love. The experience has been closer to perfect than I ever imagined (note: I did not say perfect; it’s technology, and it has its problems). I would like the iPhone to have a better camera, and I’m sure Apple will produce one well before my contract is up, putting me in the awkward position of deciding whether to spend $500 on a phone or to wait another year or so to upgrade. That’s okay with me. I use my phone a lot to check email, hop on the Web, and text message, and yes, I even make and receive calls from time to time. I’ve found that I’m not unusual among my friends and, surprisingly, my professional acquaintances. Text messaging has taken over. Voicemails are no longer left, because they are no longer answered.

So, Android phone builders, how about showing some commercials where people are actually doing something with these hot new pieces of hardware, besides dancing or quivering in fear? There will be people who buy style over substance, and you’re selling the “sizzle”. But unless there is some “steak” to go along with that sizzle, your new customers will not turn into raving fans, like I have. I recommend the iPhone to literally anyone who asks (or even to people I overhear talking about getting a new phone). Who’s going to do that with your phone? And why will they do it?

Planet Express: Evil Universe-Destroying Conspiracy?

If you’re reading my blog and you haven’t seen Futurama, watch an episode. For anyone who hasn’t seen it, here’s a brief synopsis: our hero, Philip J. Fry, pizza delivery guy, is delivering a pizza on New Year’s Eve 1999 when he inadvertently gets frozen for 1,000 years. He awakens in the year 3000 and ends up working for his great-great-great-…-nephew, Prof. Farnsworth, who is over 100 years old and has never looked better. Fry’s new job is delivering packages for Planet Express, the package delivery company Prof. Farnsworth put together around his faster-than-light spacecraft.

In one particularly suspenseful episode, Prof. Farnsworth’s clone is trying to figure out how to repair the Planet Express ship, which uses an exotic engine powered by dark matter. At the critical moment, the clone exclaims, “I understand how the engines work now. It came to me in a dream. The engines don’t move the ship at all. The ship stays where it is, and the engines move the universe around it.” This is like laying a sheet of paper on a flat surface, touching it with just one finger, and moving the paper around with your other hand. Your finger stays in the same place relative to the table, but the paper (the universe) changes position relative to your finger (the ship). It’s a completely ingenious design, but what if Prof. Farnsworth built another ship? Now imagine adding a second finger to that piece of paper and another hand to move the paper relative to that finger. What if the two ships wanted to go in different directions? The ingenious design is now completely wrong and cannot be fixed. In order to add another ship, the entire design has to be thrown out.

There is a similar situation in computer programming. It used to be called a global variable; now it’s trendier to call them static or class variables. These are variables that hold a single value for the entire system in which the class (or global) operates. At first glance, they can seem an ingenious way to solve a problem. Any time you think about using one, though, think about what might happen if you add another “ship” to the system. Who knows? Maybe the whole universe would implode.
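
Here’s the universe-moving engine as code, in a minimal Java sketch (hypothetical, obviously). The static field is the single, shared “universe”, and the second ship is the second finger on the paper:

```java
/** The Planet Express design: the engine moves the universe, not the ship. */
public class Ship {
    // A static field: ONE value shared by every Ship in the system.
    private static double universeOffset = 0.0;

    private final String name;

    public Ship(String name) {
        this.name = name;
    }

    public void thrust(double distance) {
        // Each ship "moves" by shifting the single shared universe.
        universeOffset += distance;
    }

    public double position() {
        // Every ship reads the same offset, so no ship can go its own way.
        return universeOffset;
    }

    public static void main(String[] args) {
        Ship planetExpress = new Ship("Planet Express");
        Ship nimbus = new Ship("Nimbus");

        planetExpress.thrust(10.0); // Planet Express heads one way...
        nimbus.thrust(-10.0);       // ...the Nimbus heads the other.

        // Both print 0.0: the two ships have cancelled each other out.
        System.out.println(planetExpress.name + ": " + planetExpress.position());
        System.out.println(nimbus.name + ": " + nimbus.position());
    }
}
```

One ship, and the design is ingenious. Two ships, and neither goes anywhere.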

A Few Reasons E-Book Readers Suck

I’m a casual video game player, but I’ve played a lot of games in my day. My first video arcade experience was at the Asteroids machine in the Stop-n-Go. After that game filled my body with adrenaline, it was all over for me. The prospect that I could control a whole ship that was able to blow up asteroids in space with just a dial and a button was magical. Later, playing more complex games like Ultima IV, I realized that there could be richer experiences that required a few more buttons. In college, I was introduced to the MUD (Multi-User Dungeon), which I never really got into. More recently, I started playing World of Warcraft, which I consider the most successful MUD in the world today. Over the years, I have played many console games, from first-person shooters (Halo to Ghost Recon) to platform games (Super Mario World to Mirror’s Edge, which is just fantastic for this genre) and, to a lesser extent, real-time strategy (RTS) games (I love Pikmin in this genre). Along the way, I came to a startling conclusion:

Games can be too realistic.

At a certain point, games that keep increasing in realism become… well, actually real. Real in the sense that they carry all the real-life frustration you might experience in, say, infiltrating a heavily defended building or training for the military. Let me put it this way: I would never play Halo if I had to go through six weeks of in-game training to qualify to play it. Part of the magic of video gaming is that it lets people do things they would never have a chance (or the time, or even the physical ability) to do in real life.

If you’re still with me, you’re probably wondering what any of this has to do with E-Book Readers. I recently came across an article on a dual-screen E-Book Reader that says it will be “awesome”. Why? What is so awesome about using 60 years of accumulated technology to emulate 500-year-old book technology? Isn’t this akin to emulating basic training in a first-person shooter video game? Sure, a certain segment of the population (read: geeks) gets thrilled at the prospect of running the Commodore 64 on their iPhones, but is that really the right approach?

When it comes to consuming long-form media on a high-tech device, what is needed is a new approach. Instead of emulating 8-bit technology, how about inventing a whole new 64-bit technology? In my life, audio books (in combination with the iPhone, a technology platform I always have with me) radically changed the way I enjoy books. For one, they enabled me to read for pleasure again. I find that I’m very rarely in a situation where I can drag a book along with me, much less sit down and use my two hands and two eyes to actually read it. This new-fangled technology lets me enjoy a book in a way the original author probably never intended. A first-person shooter video game is vastly more enjoyable than having actual terrorists actually shoot assault rifles at me. Likewise, an audio book is vastly more convenient than an actual pulp book.

So I ask, what advantage does an E-Book Reader have over a paper book? In my case, it’s actually worse. I don’t mind if I lose a $4.95 paperback. I can drop it, dog-ear it, write in it, kick it, rub sand on it, prop a door open with it, hit my friend over the head with it to get his attention… nothing I would do with two A4-sized pieces of LCD glass (okay, maybe prop a door open with it).

This whole e-book reader craze is destined to be relegated to history as a stop-gap technology to help folks who can’t or won’t adapt to new media technologies. In a few years, it should blow over after the “gee whiz” factor has passed by. Either that, or it will be adapted, like audio book technology, to a new, more fertile environment, and it will be another 15-year overnight success.

Update: Fake Steve Jobs and students at Princeton apparently agree with me. Glad I could help.

Update: FSJ almost quotes me: “There is no point in moving to digital readers if we’re just going to do what we did on paper.” Here’s hoping RSJ feels the same and actually comes out with a decent tablet.

This is a little out of the ordinary for me, but I just had to say something. I recently saw the 2009 Star Trek movie directed by J.J. Abrams. First, I am a Lost fan, and just seeing the “Bad Robot” banner gives me butterflies in my stomach. That guy is great. This Star Trek has some of the grit of Battlestar Galactica (the best Sci Fi drama since Farscape, IMHO) while maintaining the wholesome goodness of the original series and movies.

But there was a problem. The Star Trek movies’ audience is primarily people who are technologically savvy and who, if not anal-retentive, at least have an almost obsessive-compulsive attention to detail. Okay, so full disclosure: I just described myself. And I just can’t let go of the worst set design choice of this decade: using an actual bar code scanner on the bridge of the new/original Enterprise. Don’t believe me? Take a look at this shot, particularly the very prominent position on top of the helm, between the two bridge personnel:

Bridge of the Enterprise NCC-1701 (2009)

And now, let me introduce you to the Symbol/Motorola M2000 general purpose barcode scanner:

Motorola M2000 Omnidirectional Barcode Scanner

Tell me I’m wrong! This anachronism totally blew the movie for me. I see these in department stores. I don’t want to think about buying groceries or T-shirts while I’m watching a futuristic geek-fest!

What other interesting and/or distracting props have you seen in movies?

I’ve had a few situations with photos that just don’t look right when iPhoto corrects the red eye. The algorithm in iPhoto is sometimes just too sloppy, ruining nearby parts of the photo, especially if there is too much red tone to the skin around the eye. I needed a better way, and here it is.

What you’ll need

  1. A tool with “Instant Alpha”, the feature introduced in OS X Leopard’s Preview.app. You could also use Keynote, but I find Preview to be the best.
  2. A compositing tool. Compositing is the process of combining two images. I use OmniGraffle for this, but you could also use Keynote, OpenOffice, etc. The key here is that you will need to draw an object and be able to place it behind your photo after applying the Instant Alpha.

Step by step

Try it in iPhoto

Here is my original picture:

Original Red Eye Picture

And here is iPhoto’s attempt at red eye reduction:

iPhoto Red Eye Reduction

Note the over-ambitious red eye reduction turning my green/brown eye blue and bruised!

Get the red out

To get the red out, I use OS X’s Preview.app to apply an Instant Alpha filter. This tool is great and gives a lot of control over the size of the selected red area. Open the tool, click in the “reddest” part of the red eye, and drag until the entire pupil is selected. Repeat for the second eye, then hit the “enter” key to apply the mask. You should see some ghostly white eyes in your picture.

Instant Alpha in Preview.app

Red Eye Removed in Preview.app
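
If you’d rather script this step than click through it, here’s a crude stand-in I sketched in Java (my own approximation; it’s a simple red-dominance threshold, not Apple’s region-growing Instant Alpha, and the file names and the 1.8 factor are made up). Run it on a tight crop around the eyes so it doesn’t knock out lips and skin:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class RedOut {
    public static void main(String[] args) throws Exception {
        BufferedImage src = ImageIO.read(new File("eyes.jpg"));

        // Copy into an image type that supports transparency.
        BufferedImage out = new BufferedImage(
            src.getWidth(), src.getHeight(), BufferedImage.TYPE_INT_ARGB);

        for (int y = 0; y < src.getHeight(); y++) {
            for (int x = 0; x < src.getWidth(); x++) {
                int rgb = src.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF;
                int g = (rgb >> 8) & 0xFF;
                int b = rgb & 0xFF;
                // Knock out strongly red pixels (thresholds need tuning per photo).
                boolean redEye = r > 80 && r > 1.8 * g && r > 1.8 * b;
                out.setRGB(x, y, redEye ? 0x00000000 : (0xFF000000 | rgb));
            }
        }
        ImageIO.write(out, "png", new File("eyes-alpha.png")); // PNG keeps the alpha
    }
}
```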

Get the black in

Okay, so far we’ve gotten the red out of the picture, but it’s not anything you’re likely to publish. To do this step, we copy the image into the compositing tool (OmniGraffle in this case). OmniGraffle is what they call an object drawing tool. After copying the image into the tool, we just draw a plain rectangle right over the eyes. Make it big, but smaller than the original picture. Next, change its color to black. You may want to play with the color to get the best result; adding some red to the black will soften any red edges that might have been left during the Instant Alpha stage. Or you might feel that a deep gray gives better results. You should really zoom in on the eyes to make sure you’re satisfied, because you’re almost done!

Build the Rectangle

Change the Rectangle Color to Black

Push the Rectangle Behind the Pic
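
And here’s the compositing step in the same sketched Java form (again, file names are placeholders). Painting black across the whole canvas is equivalent to the rectangle trick: only the transparent pupils let the black show through.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class BlackIn {
    public static void main(String[] args) throws Exception {
        BufferedImage photo = ImageIO.read(new File("eyes-alpha.png"));

        // TYPE_INT_RGB has no alpha channel, which is what JPG wants anyway.
        BufferedImage flat = new BufferedImage(
            photo.getWidth(), photo.getHeight(), BufferedImage.TYPE_INT_RGB);

        Graphics2D g = flat.createGraphics();
        // The "rectangle behind the pic": black, or a very dark gray/red if the
        // Instant Alpha step left reddish fringes around the pupils.
        g.setColor(Color.BLACK);
        g.fillRect(0, 0, flat.getWidth(), flat.getHeight());
        // Draw the photo on top; transparent areas let the black show through.
        g.drawImage(photo, 0, 0, null);
        g.dispose();

        ImageIO.write(flat, "jpg", new File("fixed.jpg"));
    }
}
```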

Save your new picture

Once you’re satisfied with the picture, use the Export command of your compositing tool to save the image as a JPG. Since it’s a photo, this is the format I highly recommend. You could also use PNG, but JPG will give you the most natural results.

Export Options

Final Result

Finish it off

It is only at this step that I recommend bringing the new photo into your photo tool (I use iPhoto) to adjust exposure, etc. Adjusting these settings before getting the red out could make the process more difficult. Crop, size, and upload your photo to your favorite service!
