How to launch an online platform

I attended the Bebo developer platform announcement this morning in San Francisco. The announcement seemed to go down very well based on the immediate response, though only time will tell whether it has the expected impact.

Bebo schwag
It’s clear that a formula for launching this kind of stuff exists, and I think Bebo did a great job of giving it their own flavor. The overall format Bebo used was standard:

  • Invite people to a nice place and give them some free stuff
  • Give a presentation including a video showing customer testimonials
  • Let the founder or product owner or thought leader present the product
  • Parade the partners on stage
  • Provide demos for people to peruse after the presentation
  • Keep it short

But the nuances in the formula are what make an online platform launch successful.

  1. Create an invite-only experience: This is true with restaurants, art galleries, clubs and just about any socially-driven service. Make a select few feel important by treating them differently, and they will then be your advocates. Bebo invited press and partners to a small-ish room at the Metreon to give their presentation. Those people then felt responsible for spreading the news.
  2. Make it newsworthy: I wouldn’t say that the Bebo platform was a secret, by any means, but the features that make it worth talking about were kept secret until the event. In particular, the crowd seemed very pleased to hear that Bebo decided to emulate Facebook’s success by making their platform fully compatible with Facebook’s.
  3. Follow standards: Developers are not generally interested in proprietary environments unless there is a substantial gain to be made by leveraging that environment. Platforms on the Internet should default to known and proven standards, and when they do deviate, there should be a compelling reason to do so. Bebo indicated that there might be Bebo-specific features in the future, such as micropayments, and I suspect the developer community would be happy to customize their apps for Bebo when those features are ready.
  4. Prime the pump with partners: An ecosystem is not an ecosystem if it doesn’t have partners. So, don’t launch a service for partners with no partners already committed. But more than that, partners are the proof points the wider market looks for to validate that what you offer is in fact real. Give them the stage. Make them successful, so others want to follow suit. I wasn’t all that impressed with the NBC Universal app showcased at the Bebo event, but the Gaia Online and Flixster apps were solid. And the 20 or so partners demoing in the back of the room after the presentations were great evangelists for the platform. They were proud to be there and happy to sing Bebo’s praises.
  5. Be real: I’m always a sucker for a self-deprecating joker, but Bebo founder Michael Birch backed up the laughs with substance. He admitted that they intend to follow Facebook and do whatever they do, which is a totally viable strategy in this space at this point in time. Of course, he gave himself a great defense should they get pounded by the press, but his approach was very refreshing in a market that’s increasingly crowded with ambition and arrogance.

Again, the response by developers and then the subsequent uptake by users will be the real indicators of success. But Bebo gave themselves as good a start as any by getting the launch off on the right foot.

How to fix building construction bureaucracy

Sometimes I forget to step outside of our little bubble here and see how people use, or in fact don’t use, the Internet. When I get that chance, I often wonder if anything I’m doing in my career actually matters to anyone.

Usually, however, I’m reminded that even though the Internet isn’t woven into every aspect of everything, it has great potential in places you might not consider.

For example, I’ve been remodelling my house to make room for a new little roommate due to be delivered in September. I’m trying to do most of the work myself or with help from friends and neighbors. I’m trying to save money, but I also really enjoy it. It’s a fantastic way to reconnect with the things that matter…food, shelter, love and life.

Well, I made the mistake of working without permits, fully aware that I probably should have had them. It’s my natural inclination to run around bureaucracy whenever possible.


As luck would have it, just as the pile of demolition debris on the sidewalk outside my house was at its worst, a building inspector happened to drive by on his way to another job. He asked to see my permit, to which I replied, “The boss isn’t here. Can you come back later?”

The building inspector just laughed. After pleading a bit and failing, I started making calls to get drawings and to sort out the permits.

It was at this moment I realized how much building planning and construction could benefit from the advances made in the Internet market over the last few years. The part of construction that people hate most is perhaps the most important one. And it is this part that the Internet is incredibly well-suited to improve.

Admittedly, the permit process was not actually that painful, and it was relatively cheap, too. So far I’ve spent maybe a day in total dealing with permits and drawings, with a bit more to come, I’m sure.

But the desired effect of permitting is sorely underserved by the process itself.

At the end of the day what you want is the highest building quality possible. You want builders using proven methods with at least semi-predictable outcomes. You want to make sure nobody gets hurt. And you want incentives for people to share expertise and information.

Rather than be a gatekeeper, the city needs to be an enabler.

One of the brochures I read called “How to Obtain a Permit” includes a whitelist of project types. I’m apparently allowed to put down carpets and hang things on my walls without a permit. Glad to know that.

Strangely, after explaining all the ways the city inserts itself into the process, the brochure says on its very last page, “Remember, we are here to assist you. If you have any questions about your project, please give us a call!” I didn’t meet one person in the 6 queues I waded through the first morning who wanted to help me. They were mostly bored out of their brains.

Instead, the city should be putting that brainpower to work finding ways to lubricate conversation and collaboration around solving building problems. If the building community was in fact a community powered by thoughtful city-employed engineers, then I would be much more interested in working with them. I might even become dependent on them.

For example, if they helped me organize, store, print and even share my plans, then I’d be more than happy to let them keep my most current drawings, the actual plans I’m using to build with. If they could connect me to licensed contractors and certified service providers, I’d gladly give them my budget.

As it stands, my incentive is to avoid them and hide information whenever possible.

Imagine if I was able to submit a simple SketchUp plan to a construction service marketplace. I could then sit back and watch architects and interior designers bid for the planning work. My friends in the network could recommend contractors. Tools and parts suppliers could offer me discounts knowing exactly what I needed for the job. I could rate everything that happens and contribute to the reputation of any node in the ecosystem.

Imagine how much more value would be created in the home buying market if a potential buyer could see all this data on a house that was for sale. I might be able to sell my home for a higher price if my remodel was done using highly reputable providers. There would be a financial incentive for me to document everything and to get the right certifications on the work.

Imagine lenders knowing that I’m an excellent remodeller based on my reputation and sales track record. I might be able to negotiate better terms for a loan or even solicit competing bids for my mortgage on the next house I want to invest in.

At every step in the process, there is a role for the city government to add value and thus become more relevant. Then the more I contribute, the more it knows about what’s happening. The more it knows, the more effective it can be in driving better standards and improving safety and legislating where necessary.

My mind spins at the possibilities in such a world. Of course, when you have a hammer everything looks like a nail. But it seems to me that the building permit and inspection business is broken in exactly the places that the Internet is more than capable of fixing.

How to layer postproduction visuals in a screencast

Jeremy Zawodny and I produced another screencast last week, a look inside Pipes with Pasha Sadri and Ed Ho. The Pipes guys shared their insights while we asked a few questions and recorded the screen and the audio.

I’ve been trying to improve on each screencast with a new trick or some efficiency. This time I tried to mix in some relevant still shots in the editing process to support the voice over.

Camtasia was a little stickier here but still very easy to use. After setting up the production and editing out some bits, I used SnagIt to capture web site screen shots and crop them to focus on a small area. I imported them into the production. Then I added the screen shots to the Picture-in-picture track. Lastly, I zoomed in on each PIP file so it took up the whole screen and slid it along the timeline to get the right positioning with the audio.

There’s a segment toward the end of the video where Pasha is saying some really interesting stuff, but I didn’t have anything relevant to splice in visually, so I didn’t quite get that part right. But you’ll see that it works nicely in certain parts of the video. It keeps the pace going while people are talking. It also allows you to grab additional media that you didn’t think to pull up while recording the original video.

For example, Pasha mentions that there are several sites that have begun creating tutorials for Pipes, so I grabbed screenshots of 3 that I found and layered them in.

I don’t think this is what the software was intended to do, so please tell me if you know a better way to accomplish this same effect. Here is the screencast which is also available on Yahoo! Video:

A community site without a community

Taking a little time at home last week gave me a chance to play around with one of my experiments that was nearly at its end. FlipBait is a simple Pligg/MediaWiki site that pokes fun at the dotcom golddiggers out there.


It’s mostly a sandbox for me both technically and journalistically. But it’s not really helping to inform or build community the way I hoped.

First, after a month I still have no participants. There have been several passersby, but a group publishing site needs a core team looking after its well-being.

Second, it’s just too much work in its current form for me to keep posting to it.

I sort of expected this to happen, but I’m a big fan of experimentation. So, I thought I might analyze the issues for a few blog posts and close it down…

…but then Pligg 9 was released.

The new version of this Digg-like CMS added a key feature that may alter the dynamics of the site completely: Feed Importing.

I give it a few RSS feeds. It then imports the headlines from those feeds automatically.

Now, I have a bunch of feeds all pouring headlines into FlipBait throughout the day. I’m aggregating the usual suspects like TechCrunch and GigaOM and VentureBeat, but I also found a few sources from various searches that effectively round out the breadth of the coverage.

I can find new dotcom golddiggers without fail every day.

This is very cool. Though you can see back in the Pligg forum archives that there was some debate about whether this feature would destroy the whole dynamic of voting-based publishing. That may be true, but it’s just too useful not to have.
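For the curious, the mechanics behind a feature like this are simple. Here is a rough sketch of what an import pass boils down to, written in Python rather than Pligg’s PHP, with placeholder feed URLs and an in-memory “seen” set standing in for the site’s database:

```python
# A rough sketch of a feed-import pass -- not Pligg's actual code, just the idea.
# Requires the feedparser library: pip install feedparser
import feedparser

# Source feeds are whatever you configure; these URLs are placeholders.
SOURCE_FEEDS = [
    "http://example.com/techcrunch.rss",
    "http://example.com/gigaom.rss",
    "http://example.com/venturebeat.rss",
]

seen_links = set()  # in the real system this would live in the site's database

def import_pass():
    """Poll each source feed and collect headlines we haven't imported yet."""
    new_stories = []
    for url in SOURCE_FEEDS:
        feed = feedparser.parse(url)
        for entry in feed.entries:
            link = entry.get("link")
            if not link or link in seen_links:
                continue  # already imported, skip it
            seen_links.add(link)
            new_stories.append({"title": entry.get("title", ""), "url": link})
    return new_stories

# Run this on a schedule (cron, etc.) and the headlines pour in all day.
```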

Now, this might be the most interesting part…

I’m also importing stories from del.icio.us using a new tag: “flipbait”. That means that if you tag an article with “flipbait”, Pligg will automatically import that article and make it available to the FlipBait community. That’s how I’m entering my own favorite posts for the site as opposed to using the ‘submit’ function directly at flipbait.com.

You don’t ever have to visit the domain, actually, because you can pull articles to read from the RSS feed and submit articles to the site just by tagging as you already do.
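The trick that makes this work is that a del.icio.us tag is itself just another feed. Here is a minimal sketch of the idea; the tag feed URL format is my recollection of del.icio.us at the time, so treat it as an assumption and verify it:

```python
# Tagging on del.icio.us becomes a "submission" because the tag has its own feed.
# The URL format below is assumed, not confirmed -- check del.icio.us for the real one.
import feedparser

TAG = "flipbait"
TAG_FEED = "http://del.icio.us/rss/tag/%s" % TAG  # assumed feed URL format

def pending_submissions():
    """Everything anyone has tagged 'flipbait' and the site hasn't imported yet."""
    feed = feedparser.parse(TAG_FEED)
    return [(e.get("title", ""), e.get("link", "")) for e in feed.entries]

for title, link in pending_submissions():
    print("submit to FlipBait:", title, link)
```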

Hmmm…what does that mean? Interesting question. Can a meaningful community form around a word that represents an idea?

Preview of the del.icio.us publisher API

I just posted a short screencast on the YDN blog of the cool new publisher API coming from del.icio.us soon. I’ve also embedded the video below. Lots of interesting possibilities with this new service, for sure.


How to offer simple RSS badges for your users

The key breakthrough that made it possible for YouTube to ride MySpace’s heavy-traffic coattails into its current state as a mass media service is the concept of widgets, often called badges in related contexts. Although offering widgets or badges may still seem like a far-off idea for most web site owners, there are a few tools that can make this a snap to offer your users if you’re ready for it.

(I’ll assume here that you already know what widgets and badges are. If you don’t, I’ve been tagging articles addressing the topic of widgets that may be helpful.)

In the case of YouTube, they allowed users to post the YouTube video player to any web page with a simple copy and paste operation. Since most web site owners deal primarily with text, the equivalent would likely be a feed of RSS content that people could display on a web page. It would clearly be best to allow your users to display a feed of the things they are contributing to your web site, but if you don’t have user-contributed data to give back to your users, it’s still worth offering this functionality using your own content to see what happens.

Here’s a really cool tool I recently found that made it possible for me to offer badges to users on the FlipBait web site. It’s an open source service called Feed2JS, and it appears to be developed by Alan Levine. It requires another open source service called MagpieRSS to operate, but MagpieRSS takes maybe 10 minutes at most to download and install.

After you download and install these scripts, you can point them at a feed you want to display and get back code that you can include on any web site to show that feed nicely.

In other words, you now have a badge platform to offer your users.
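To make the mechanics concrete, here is a simplified sketch of what a Feed2JS-style badge service does, written in Python rather than the PHP/MagpieRSS combination Feed2JS actually uses: fetch the feed, render the items as a small chunk of HTML, and wrap it in document.write() calls so a single script tag on any page displays it. The URLs and class names here are illustrative only.

```python
# A simplified sketch of the Feed2JS idea (the real tool is PHP + MagpieRSS).
# Requires the feedparser library: pip install feedparser
import json
import feedparser

def feed_to_js(feed_url, max_items=5):
    """Fetch a feed and return JavaScript that writes its items into a page."""
    feed = feedparser.parse(feed_url)
    html = "<ul class='rss-badge'>"
    for entry in feed.entries[:max_items]:
        html += "<li><a href='%s'>%s</a></li>" % (
            entry.get("link", "#"), entry.get("title", "untitled"))
    html += "</ul>"
    # json.dumps produces a safely escaped JavaScript string literal.
    return "document.write(%s);" % json.dumps(html)

# Serve feed_to_js() from a URL and a user's badge is one line of copy-paste:
#   <script src="http://example.com/badge.js?feed=...their feed url..."></script>
```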

I tried this out on the FlipBait web site, and it worked out of the gate. In fact, you can now see on my blog sidebar here the posts I’ve submitted to FlipBait. Each user on the site has access to his badge via his profile page. Now everyone can take their contributions with them wherever their “Internet startup news” identity gets expressed.

It couldn’t have been much easier to set up, either. I’m hoping, actually, that the Pligg team incorporates something like this into the source code.

There are also some nice formatting capabilities in Feed2JS that would make people happy, I’m sure. But that adds some complexity I’ll address at a later date. The important thing is to push out a feature like this, watch for uptake, and then evolve it.

I’d be interested to know if other people have tried any other similar solutions or used tools from some of the recent startups in this space and what their experiences have been. Please comment or blog about it if you’ve found another way to accomplish this without having to write the code yourself.

How we made the BBAuth screencast

The news that seemed to get overlooked amid the amazingness that became Hack Day was the release of a login API: BBAuth, or Browser-Based Authentication. This new service allows any web site or web application to identify a user who has a Yahoo! ID, with the user’s consent. Dan Theurer explains it on his blog:

…instead of creating your own sign-up flow, which requires users to pick yet another username and password, you can let them sign in with their existing Yahoo! account.
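From a web site’s point of view, the handshake looks roughly like the sketch below. This is my own Python pseudocode, not the official spec: the endpoint URL, parameter names and signing recipe are placeholders based on my reading, so check the BBAuth documentation for the real details.

```python
# Rough sketch of the browser-based auth handshake from the third-party site's
# side. The endpoint, parameter names and signing recipe are placeholders; the
# authoritative details are in the BBAuth documentation.
import hashlib
import time
from urllib.parse import quote

APP_ID = "my-registered-app-id"      # issued when you register your application
SHARED_SECRET = "my-shared-secret"   # also issued at registration; kept server-side

LOGIN_ENDPOINT = "https://api.login.yahoo.com/WSLogin/V1/wslogin"  # placeholder

def sign_in_url():
    """Build the URL your 'Sign in with Yahoo!' link points at."""
    params = "appid=%s&ts=%d" % (quote(APP_ID), int(time.time()))
    # The request is signed with the shared secret so Yahoo! can tell it really
    # came from your application (the exact recipe here is illustrative).
    to_sign = "/WSLogin/V1/wslogin?" + params + SHARED_SECRET
    sig = hashlib.md5(to_sign.encode("utf-8")).hexdigest()
    return "%s?%s&sig=%s" % (LOGIN_ENDPOINT, params, sig)

# The flow, end to end:
#   1. The user clicks your sign-in link, which points at sign_in_url().
#   2. They sign in on Yahoo! and grant your application access.
#   3. Yahoo! redirects back to the endpoint you registered, passing a token.
#   4. Your site exchanges that token (another signed request) for credentials
#      it can use to identify the user -- no new username or password needed.
```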

My mind keeps spinning thinking of the implications of this…more on that in a later post.

It was immediately obvious to me when I heard about it that this concept was going to be hard to fully grok without some visuals to explain it. So I sat with Dan yesterday to create a video walk-through that might help people digest it (myself included). Here is a 5 minute screencast talking about what it is and an example of it in action (also available on the YDN blog and on Yahoo! Video):

The screencast itself took only a few minutes to record. Here’s how it went down:

  1. I closed all my applications on my laptop other than my browser (or so I thought) and launched Camtasia.
  2. We spent 5 minutes discussing what we were going to say.
  3. I clicked ‘record’.
  4. We talked for 5 minutes.
  5. I clicked ‘stop’.
  6. I selected the output settings and it then produced a video file for me.
  7. DONE. That part took about 20 minutes.

The next part, posting to a video sharing site, got a little sticky, but here’s what I learned:

  • I tried Yahoo! Video, JumpCut and YouTube.
  • Outputting my screencast in 320×240 resolution saves a lot of time for the video sharing sites.
  • Yahoo! Video liked the MPEG4 format most. YouTube claims the same, though it wasn’t obvious after trying a few formats which one it liked most.
  • JumpCut was a snap to use, but the output quality was a little fuzzier.
  • Titles…I forgot the damn titles, and it just looked too weak without some kind of intro and outro. Camtasia gives you a couple of very simple options. I added an intro title in less than 5 minutes.
  • Logo! Ugh. After encoding it about 8 times to get the right format I realized the logo really needed to be in there:
    1. I took a quick SnagIt screenshot of the YDN web site, played with it a bit and made a simple title screen.
    2. Saved it as a jpeg
    3. Imported into my Camtasia screencast
    4. Inserted the title image in the beginning and a variation of the same at the end
    5. Dropped a transition between the title frames and the video
    6. Titles DONE. That took less than 30 minutes…could have taken 2 seconds if I was prepared.
  • Wait…the screen wasn’t big enough. You couldn’t see the graphic that Dan points to in his explanation because it’s too small. Not a problem. Camtasia includes a simple zoom tool:
    1. I played the screencast again and found where I needed to zoom.
    2. Inserted opening zoom marker
    3. Selected zoom size. Clicked done.
    4. Found the end of the segment where I wanted to zoom out.
    5. Inserted another zoom marker.
    6. Opened zoom window back up to full size.
    7. DONE. Maybe 15 minutes to do that.
  • Output one last time
  • Upload.
  • DONE

Then all I had to do was write a blog post and embed the video in that post. That took about 10 minutes.

All in all, I probably spent close to 2 hours beginning to end producing this screencast, but most of that was learning a few tricks. Next time I do this, I bet I can complete the whole thing from launching Camtasia to posting on a blog in 45 minutes, possibly less.