Archive | Technology

Nokia: Let ’em Make Cake…

The future is wireless, or at least that is what Nokia, Ericsson and a host of startups and network operators are earnestly hoping. But the quick success of 3G – The Third Generation of mobile telephony – is more than profitable icing for these companies; it has now become a matter of survival….

This article, which ran in the February, 2001 issue of Tornado Insider magazine, looks at the overall climate in European development of 3G, and then explores how each of Europe’s largest telecom networking manufacturers, Ericsson and Nokia, is coping with the challenge.


In the main lobby of Nokia House, a wood-steel-and-glass curiosity in the Finnish city of Espoo, is an impromptu cell-phone museum. In it, alongside all the sexy phones that rocketed the Finnish manufacturer to No. 1 in handsets, sits the suitcase-sized Mobira Talkman "portable" cell phone all the yuppies were buying in 1986.

For all its prescience in handset design and user habits, no one knows better than Nokia how difficult it is to predict future trends. Painfully aware of its industry missteps earlier in the 1990s, Nokia discusses 3G with reverence, making predictions about "classes" of applications and "styles" of usage.

But it won't predict specific applications that will emerge as killers. The logic is cunning: Predict the next SMS? Thanks, no. But it will hot-house every person with an idea for an application, maintain an open API (application program interface), and provide technical details to everyone and marketing support to the successful few. Now there's a situation in which Nokia doesn't care what the next big thing is, because in theory at least, it will already have it under its wing.

“Nokia’s approach is absolutely that,” says Mika Koskinen, CEO of Entirem, which develops a secure wireless transaction platform for banks and portals in cooperation with Nokia. “The general impression is: focus very strongly on our core business, and don’t then get too heavily involved with third parties, at least at the early stages.”

For startups, this is great news. “It’s impossible for us to define the killer apps,” says Mikko Pyykka, Nokia’s 3G application marketing manager, “so we just need to cooperate with a large number of developers. But we think that at the end of the day the real killers will be somehow related to messaging.”

Messaging, Nokia says, will be the main application when it comes to revenues for 3G, and Nokia, like Ericsson, farms its developer community for the latest and greatest. But unlike Ericsson, Nokia seems cut-and-dried about where application developers sit in the overall food chain – they're valued, to be sure, but they're outsiders.

During the Sydney Olympics, MobileChannel.Network set up a one-way messaging system for Nokia to provide sport content services. MC.N now cooperates with Nokia on a number of levels. “We did some basic business studies with them,” says MC.N’s Janne Makinen.

“It’s kind of a two-way development cycle. They give us technical access and early releases of new products for us to develop new versions of our product, and we give them feedback of how things work. They also give us a live environment in which to test our solutions before we go to operators.” Makinen says he believes that Nokia’s involvement will play a significant role in MC.N’s development as a company, creating a need for MC.N products by delivering Nokia’s goal of a complete end-to-end system, from network to handsets to applications.

Another Nokia developer is Genimap (formerly Karttakeskus), which began cooperating with Nokia in 1996, when Nokia released its first Communicator – the hinged-brick PDA aimed at the high-flying business and "poser" crowd (that's Nokia's own internal market-segment term). The first mapping products were popular address finders: the user entered an address and was served a digital map.

“There are different levels of cooperation,” says Mikko Salonen, CEO of Genimap. “In the beginning, it was very important for us to get cooperation on development tools – how to make WAP-based applications, and so on. Nokia gave us technical papers and documentation, but there was also a two-way exchange of technical information – we also gave our ideas to them and shared our views.”

Currently, Genimap is working with Nokia to develop location-based services such as “turn-left, turn-right” maps complete with street-level content. But Salonen stops short of saying that his company’s relationship with Nokia has changed his life. “The relationship has accelerated our business plan, and they are important for us, but I’m not willing to say this is so very extraordinary or different – they’re just a partner,” he says.

Nokia likes it that way. “In some cases, as with customer-specific services tailored and run for one operator, we might have an exclusive,” says Nokia 3G strategist Ilkka Pukkila, “but we would never go with one single application provider, for example rich SMS. Our strategy is to give operators as much choice as we can, but to offer value added – such as with the interplay between applications, like adding theater tickets to your datebook. We control the platform, and integrate the applications to suit the specific needs of the customer.”

Pukkila says very soon you and I will use our "personal trusted device," a kind of handheld multimedia terminal, to buy a Big Mac. Why would we ever buy a burger with the phone? Pukkila points out that this is the wrong way of thinking. "You wouldn't want to do it just out of the blue. But, for example, McDonald's will have its own 3G wireless LAN in the restaurant, and when you walk in, you'd get, for example, messages about specials, which you could buy with your phone. And while you ate, you could view McDonald's content. Or you could pre-order and pick up your paid-for meal with no waiting when you arrived," he says.

To Nokia, it would seem, development of these kinds of applications will for the foreseeable future be left to the third parties, allowing Nokia – and McDonald’s for that matter – to pick the best of breed of every single app it buys. In fostering this climate, Nokia will likely never again have to say that the latest hot thing just never appeared on its radar.

In concentrating on the network vendors' efforts to develop applications, we have left undiscussed, for now, the solidity of UMTS as telecom savior – a model that is far from certain. Industry analysts, notably those within Forrester Research, have increasingly been pointing to flaws in the revenue models the telecom operators are banking on. "There's no killer application. There is no such thing. Let's go for a killer environment. We don't want to be bound by specific applications."

This quote, from an executive at a European mobile operator, was featured in Lars Godell’s January Forrester report on the future of UMTS, which the report called “a survival question.”

“Of course it makes more sense for the equipment manufacturers to do the lion’s share of the application hot-housing,” Godell told Tornado-Insider, “But every operator must also have an open mind, to nurture niche and local applications developers as well as creating strong local partnerships.”

Forrester remains skeptical that, even with the hottest applications, incumbent European mobile operators as we know them today will survive the shakeout. Forrester predicts that by 2011, impending pressure from the network vendors to upgrade again to 4G technology will fuel a desperation to reach profits from UMTS amid an environment of saturated subscriber markets, continued spending on marketing and network upgrades and declining average revenue per user (ARPU).

Could it be that eventually the network vendors end up swapping gear for equity to bail out operators? Will forced consolidation result in the death of all independent operators and allow the survival of only the largest pan-European players?

Other analysts say that the Forrester report is overly gloomy and that it does not take into account the improved capital efficiency of investment in 3G as compared to 2G. But most agree that evolution, and indeed consolidation, is not an unlikely prediction. "The shape of the mobile communications industry has not stopped evolving," said Peter Knox, a telecom analyst at Commerzbank Securities, "and consolidation is probably a likely result."

In-Q-Tel as Cyber Security Tsar? Weirder Things Have Happened

[This is the second of a two-part blogpost that originally appeared in Plausible Deniability, the blog of the enterprise security practice at The 451 Group. I wrote it in August, 2008.]

Last Friday I began to discuss In-Q-Tel and its investment in Veracode, and went a little into IQT's investment strategy. As we said, IQT exists to answer the question, 'Is it possible to solve [problem set here]?' If the answer is 'yes', IQT's job is to identify fiscally viable, practically capable, innovative private organizations that might be able to solve the problem at hand.

We also said that the political winds in Washington were shifting like Dick Nixon’s eyes during a bad news briefing.

Among the key problems I think IQT is now looking at is the oft-lamented fact that the US has no cogent strategy to deal with cyberwarfare and no leadership on the issue (in fact, the issues surrounding this are so complex that it's hard to find anything people want to talk about less). Simultaneously, political winds are blowing funds in the form of budget dollars from many places (including my wallet) toward the white tower that is the Office of the Director of National Intelligence.

It is said that the CIA under George Tenet somehow recognized that private industry was surpassing government talent in the field of technological innovation. Is it possible that the CIA was prescient enough to recognize over the past few years that its budgetary influence was waning and that to re-increase its stature in the intelligence community it would need to get really geeky about cyber-warfare and get itself some really cool kit?

The CIA? Prescient? It’s become so hip to make the CIA the butt of jokes recently that people forget (this is a technological, not a political, discussion) just how much seriously cool stuff it has done.

In his 2002 review of The Bourne Identity, the New York Times’ A.O. Scott wrote that,

The movie … trots out a quaint view of the C.I.A. as not only bottomlessly malevolent, but also endlessly and terrifyingly competent. Shortly after they see Marie’s image on a security camera satellite feed, the folks at Langley are in possession of her entire life history, and they are able to track her movements across Europe with a few clicks of the mouse. This is inadvertently hilarious in light of recent news reports. If Marie had only thought to disguise herself as an international terrorist, she might never have attracted the agency’s notice in the first place.

Well, IQT itself has been a monster success. Its investments are absolutely classic examples of how to do it right: tons of due diligence, heaps of knowledgeable people asking sensible questions about the technology, the leadership of the innovative company-prospect. That they don’t make large investments (generally they’re capped at $3m) or take an equity position is a question of taste, or style, or, you know, propaganda value. But the investments are made in the form of, essentially, non-recurring engineering fees to make something wicked-cool out of something more pedestrian, and come complete with a promise to buy a bunch of it if it works out right.

But back to cyberwarfare.

If our current policies and the reality of the US' digital security stance are any indication, policy makers would rather read tax code than infosec material – hell, even I'm reading a book on tax code, and I think this security stuff is a gas.

You take a look at something like H. R. 130, the Smarter Funding for All of America's Homeland Security Act of 2007, whose dense prose mentions the word 'cyber' a total of once, and you wonder. But I digress.

We wrote in this blog back in April about Aaron Turner & Michael Assante's excellent article in CSO Magazine, in which they compare and contrast the 18th-century response by the United States to pirates on the high seas with today's federal response to Internet crime. (Read Turner's prescient testimony to the House Committee on Homeland Security, Subcommittee on Emerging Threats, Cybersecurity, and Science and Technology here.)

In that blog post we also mentioned that back in 2004 MIT Technology Review published a terrific piece by Eric Hellweg, "Cyber Security's Cassandra Syndrome," which discussed the stalled and possibly addle-headed Bush administration approach to the problem of leadership of the effort to protect the nation's computing infrastructure.

The problem, Hellweg argued, was that you couldn't get the right people to do the job of protecting the nation's critical infrastructure (which is basically our government's ability to use computers and the Internet) because, well, it's impossible: you can't get anyone to take all the responsibility while having none of the authority to do what's necessary. The fact of the matter is that this is no longer about defending against attack; it has long since devolved to the point that our government should be asking itself, to paraphrase Ed Harris playing Gene Kranz in Apollo 13, 'Whadda we got in this country's critical infrastructure that's good?'

More specifically, how badly are we already owned by foreign nation-states and commercial entities, and how can we look at ways to fix that? Then we can start talking about ways to have ‘Smarter Funding for All of America’s Homeland Security’.

So: IQT as savior? Are Darby and Geer (who raised some fascinating and common sense points on security metrics in his testimony before the Subcommittee on Emerging Threats, Cybersecurity, and Science and Technology last year) and the infosec team at IQT the stand-in Cyber tsars by forfeit?

Well, there’s an argument for it. Geer (and Assante, by the way) sits on The Center for Strategic and International Studies (CSIS) Commission on Cybersecurity for the 44th Presidency, which is working to develop recommendations for a comprehensive strategy to improve cybersecurity in federal systems and in critical infrastructure. So do a lot of other really smart people. But as several of them have admitted, the danger is that the need to reach high level consensus will lead to watered-down pablum despite the best intentions of a truly smart group of people.

It's Washington. It's almost inevitable. Don't get me wrong: some great ideas will come out of the CSIS deliberations. Our biggest fear is that politics will pare down the final recommendations to be on par with 'don't-click-on-attachments, wear-mittens, study-hard'-caliber advice.

A look at the political climate in Washington, especially that surrounding the intelligence community, shows that the winds are being shifted by a Bush administration miffed at…Well, who knows.

The unclassified Annual Threat Assessment of the Director of National Intelligence from this past February had this to say about Cyber warfare:

The US information infrastructure — including telecommunications and computer networks and systems, and the data that reside on them — is critical to virtually every aspect of modern life. Therefore, threats to our IT infrastructure are an important focus of the Intelligence Community. As government, private sector, and personal activities continue to move to networked operations, as our digital systems add ever more capabilities, as wireless systems become even more ubiquitous, and as the design, manufacture, and service of information technology has moved overseas, our vulnerabilities will continue to grow.

In an Op-Ed piece in The Wall St Journal this past April by DNI Mike McConnell and House Intelligence Committee sub-committee chair Anna G. Eshoo, they talked about responsible domestic surveillance:

If we are going to ask our intelligence agencies to help defend our country, we need to carefully construct policies that give them access to this information when necessary, and protect the rights of Americans. The National Security Agency, for example, is governed by strict rules that protect the information of U.S. citizens. It must apply protections to all of its foreign surveillance activities, regardless of the source. As we add new authorities and programs to secure our country, we must ensure appropriate safeguards and protections to secure our liberties. We must maintain the balance between safety and freedom.

And then pooh-poohed technology…

Too often, our country has invested in dazzling new technology as the solution to our intelligence woes. Technology is vitally important. But a computer is only as good as the person who programs it. No piece of technology can substitute human judgment. A computer — even one that costs millions — cannot recruit a spy.

Meanwhile, Joe Lieberman (IND-CT) and Susan Collins (R-ME) are asking good questions that are worth reading. If you're up for it, answers to those questions are in a letter that inadvertently serves to highlight the marvels of redaction.

We, like you, can only speculate about what the CIA will do with Veracode’s technology, but I would be willing to bet it has something to do with finding weaknesses in code. Which is quite useful stuff, if it works. I bet there’s more where it came from, and I bet IQT will be and is looking at other infosec investments.

And with regards to Cyber warfare, we would say that, at the very least, IQT and its staffers are raising the level of discourse about the problems faced by the US – or at least by the CIA.

IQT turns to old school shaking-and-breaking with Veracode investment

(Written August 14th, 2008 as part I of a two-part post for Plausible Deniability, the blog of The 451 Group's Enterprise Security Practice. Read Part II here)

In late July, binary code analysis-as-a-service provider Veracode announced that it had received an investment from In-Q-Tel, the not-for-profit that serves as the venture arm of the Central Intelligence Agency.

This is a two-part blog about IQT, its investment in Veracode (and what I think is IQT's future), and then how IQT and the CIA intend to compete in an intelligence-community atmosphere whose political winds are shifting like Dick Nixon's eyes at a tough press briefing. To be fair, the specifics of the Veracode investment – even whether Veracode works or not – are less interesting to us than the apparent shift in investment strategy at IQT.

Many people sort of wink knowingly when they hear about IQT, but don't really have a sense of what the thing does or, in fact, of its successes to date. The interesting thing, which we will go into in a bit, is that for the most part IQT – which sounds really spooky and security-ish – has invested in hugely successful companies that could not be less directly related to information security.

Like OpGen, which does single molecule DNA analysis technology to identify and analyze microorganisms. The closest it gets to something like IPS is, well, IPS, or Infinite Power Solutions, which makes thin-film energy storage devices for microelectronics. It’s not like it’s a bunch of guys dressed like Lefty skulking around corridors in Silicon Valley saying, “Psssst! Hey Bud…Wanna take some NRE money?”

IQT exists to answer the question, 'Is it possible to solve [problem set here]?' If the answer is 'yes', IQT's job is to identify fiscally viable, practically capable, innovative private organizations that might be able to solve the problem at hand. By providing access to these technologies, often by repurposing existing ones to accomplish things outside the scope of their original design, IQT increases the capabilities of its main customer, the CIA.

So going out on a long, unsupported limb, we think that the Veracode investment is the first in what will be a string of more traditional infosec investments, especially in the areas of digital identity and access control technologies addressing the concepts of digital persona. And we think that this is being driven in part by a several-years’ long political climate change that has led money and influence away from the CIA.

More on that next week.

By the way, IQT has not spoken to me about this issue other than typically flacky guff about the Veracode investment, to wit:

“…We have relatively recently expanded our internal organization in terms of the technology practice, derived from the specific problems we see; we spend lots of time examining not just the technical capabilities but the company itself. Our selection of Veracode speaks for itself; as a strategic investment firm it’s there to meet a strategy.” Blah, blah, blah [blah, blah, blah added].

IQT is, intriguingly, a 501(c)(3) not-for-profit organization, meaning, I suppose, that donations to it are tax deductible. I’ll try to donate fifty bucks and blog about how that works out, and what kind of newsletter I get for it.

Its activities are unclassified, and the company only deals with open source (in the spook sense, not the licensing sense) stuff.

There are some fascinating documents available on the CIA's website about IQT's history, and on IQT's website about the organization and how it has been operated – especially including a report excoriating the handwringing at the CIA as it tried to get past its 'Private sector? Pah! We made CORONA! We can read the gender of a newt from nine miles in space!' attitude.

In the past, this has often not been a case of information security in a standard sense of the phrase. Still, the perception of IQT as a ‘security’ investor remains.

When the investment in Veracode was announced, many felt that it was par for the course – code analysis, security, CIA… All goes hand in hand.

Except that for most of its nearly ten year history, IQT has been anything but an investor in information security companies. Sure, since its inception it’s made investments in lots of things that seem cloak and dagger. Keyhole, the satellite imagery and 3-D Earth visualization company, was an early investment (in 2003 my accountant, reviewing my expenses, saw a credit card charge to ‘Keyhole’ and asked me if I was trying to write off a visit to a strip club); it ended up becoming Google Earth.

Many other investments are in areas like zoom lens development, chemical analysis tools, entity extraction and semantic analysis stuff. An analysis of the investments that the, uh, firm, has made since its inception under the George Tenet-led CIA of 1998 shows that most of the investments are of the build-a-cooler-mousetrap variety for things that seem decidedly bookish.

Of its 76 investments, IQT has made ten in what I would consider to be classic infosec investments. The other 66 investments have been spread across IQT’s other practice areas: Application Software and Analytics, Bio, Nano, and Chemical Technologies, Communications and Infrastructure and Embedded Systems and Power.

In-Q-Tel’s Digital Identity and Security group invests and seeks technologies in the truly hot if not downright sexy areas of identity management and access control as well as risk analysis capabilities, system design and analysis tools and policy definition and management. The investments have been:

  • 3VR Security (wicked-cool video facial recognition, like, ‘Show me where this guy has been on every camera you have’ kind of recognition – you have to watch the demo);
  • A4Vision (facial biometric and camera tracking systems technology acquired in January 2007 by BioScrypt, which itself was acquired in January, 2008 for $44.3m by L-1 Identity Solutions, Inc);
  • ArcSight (the enterprise security information management vendor which has since, of course, gone public and whose stock has recently recovered from elephant-tranquilizer-territory);
  • PKI and ID infrastructure vendor CoreStreet;
  • Encryption vendor Decru (bought by Network Appliance, Inc for $272.1m in June, 2005);
  • Master data management vendor Initiate Systems (whose $26m series F round takes total funding to about $61m);
  • Awesome-cool ‘where’s-that-RF-device?’ vendor Network Chemistry, which sold its security assets to Aruba in July 2007 and got waaaay pissed off with us when we priced that deal at $3m;
  • SRD Software (if you thought RSA's Verid was spooky you should talk to these guys; it was acquired in January, 2005 by IBM for $69m);
  • and, of course, Veracode.

But the Veracode investment is interesting. The core technology of Veracode’s on-demand service was developed in 2002 at @stake, the pen testing and assessment firm acquired for $49m by Symantec in April, 2004. Want to know what’s also interesting? IQT President and CEO Chris Darby was Chairman and CEO of @stake. And the Veracode investment comes three months after the appointment by IQT of Dan Geer as its Chief Information Security Officer.

Now, Geer is a plain-spoken star in the security world. He was CTO at @stake (and a gazillion banks) before a much celebrated and reviled paper declaring that Microsoft was a national security threat hastened his departure (to quote another former @staker, 'It's hard not to suck deep, deep down when you are Microsoft Windows'). Geer has also worked at Verdasys and on a host of other projects and organizations (more on him and some of those other projects tomorrow).

Based on what I know of IQT and of Geer (I've never met Darby, but I think Geer is a truly honorable guy), I don't believe there is much connection between Geer's seat on the board of Veracode and the IQT investment in Veracode. We understand that IQT has actual conflict-of-interest firewalls that the firm takes seriously, so we don't believe Geer would have been involved in the investment decision on the IQT side. I am much more willing to believe, based on IQT's investment history and the people involved, that these guys simply knew the stuff was out there to meet the specific requirements that were presented, and that Darby and Geer would naturally have said, 'Oh yeah, that stuff – you wanna talk to the guys at Veracode.' As I said, maybe I am being a sucker, but I think not.

For Veracode, the cachet of the investment is PR no one can buy, and having sold stuff to the CIA will on its own be enough to open doors at other government agencies (especially, one would assume, the three-letter kind). And for reasons I will get into tomorrow, I think that the statement I made earlier – that this is the first in what will prove to be a series of straight-up infosec and ID and access management-related investments – will prove true.


Location Based Services: A Primer

[This article was jointly written with Rick Mitchell]

One of the hottest buzz terms these days [2000] is “location-based services” – products that can serve up extremely localized content to mobile phone users. Let’s say you are standing on a corner in Amsterdam and punch in a request to your handset or personal digital assistant for directions to the nearest cyber cafe. The request goes to a server that combines it with your position, and sends back useful information via SMS or WAP [Told you it was 2000].

This is not a 3G phenomenon – some mobile operators have already implemented mobile positioning systems. Those operators are offering customers entirely new types of value-added services based on their location and preferences.

There are several techniques to position subscribers, but the most common one today is based on the identity of the cell (sector) and a calculated distance between the subscriber and the base station. In the US, the Federal Communications Commission has mandated that operators be able to locate users to aid emergency services. In Europe, commercial ventures are poised to drive the technology every bit as hard, while the EU also pushes legal restraints forward.

Depending on network topology, current operators with the GSM standard can also calculate your position, to within, say, 300 to 400 meters, or even better when combined with intelligent software. Ericsson has, for about a year now, hooked up Tommy’s, a Stockholm-based parcel delivery service, with location services that allow the company to track its trucks throughout the city.

The level of accuracy desired for useful location-based services, such as a "turn-left, turn-right" set of walking directions, is generally higher in urban areas than in rural ones. Similarly, an emergency service like an ambulance needs less accuracy to find a car accident in the countryside (where there are few roads and a position within, say, 500 meters would suffice) than in the city (where 500 meters could place an ambulance on the other side of a city block). However, FCC rules require emergency accuracy better than 50 meters regardless of whereabouts, thereby almost guaranteeing the widespread implementation of network-assisted global positioning system (A-GPS), which gives location to within 10 meters.

Getting a signal

Positioning techniques can be either "terminal-based," if the positioning system to some extent uses the terminal as a logical entity to calculate positioning data, or "network-based," where the terminal is not used as a logical node to calculate positioning data.

The network-based system of cell global identity and timing advance, or CGI-TA, is one of the coarser location methods, though it has very useful applications running in both test and business systems. In the overall scheme of things, from least accurate to most accurate, the other techniques are the network-based uplink time of arrival (UL-TOA) and the network/terminal-fusion-based techniques of enhanced observed time difference (E-OTD) and A-GPS.

"With fleet management systems, using a combination of intelligent software and CGI-TA, the application can trace a vehicle with a GSM phone," says Ericsson's x Swedberg, the senior market manager for Mobile Positioning. "The 'banana-shaped' footprint created by this method is interpreted by the software, which compares movements with a map and can therefore mark very precise locations – it actually works better practically than it would seem to technically."
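The distance half of CGI-TA comes from GSM's timing advance, a 6-bit value the base station assigns so each handset's bursts arrive in the correct time slot. As a rough sketch of how that yields a distance band (the constants are from the GSM air interface; the function name and its use here are illustrative, not drawn from any Ericsson product):

```python
# Approximate the distance band implied by a GSM timing-advance (TA) value,
# the "TA" in CGI-TA positioning. Each TA step equals one GSM bit period of
# round-trip delay, which works out to roughly 550 m of one-way distance.

C = 299_792_458.0            # speed of light, m/s
BIT_PERIOD = 48 / 13 * 1e-6  # one GSM bit period, ~3.692 microseconds
STEP_M = C * BIT_PERIOD / 2  # one TA step ~ 553.5 m of one-way distance

def ta_distance_band(ta: int) -> tuple[float, float]:
    """Return (min_m, max_m) distance from the base station for a TA value."""
    if not 0 <= ta <= 63:
        raise ValueError("GSM TA is a 6-bit value, 0-63")
    return ta * STEP_M, (ta + 1) * STEP_M

# A handset reporting TA = 2 sits roughly 1.1-1.7 km from the base station.
lo, hi = ta_distance_band(2)
```

Intersecting that distance ring with the serving cell sector is what produces the "banana-shaped" footprint Swedberg describes; the intelligent-software layer then narrows it further by matching successive fixes against a road map.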

A-GPS is perhaps the most reliably accurate system, giving very precise location information. The assistance comes in because GPS signals are rather weak, and the information needs boosting when users are in buildings or in streets surrounded by very tall buildings.

The whole thing is a trade-off: network-based solutions can use current handsets, giving operators 100 percent penetration of devices now. But more accurate terminal-based designs will have to wait until the terminals are actually here, rendering meaningful penetration four years away.

“It’s in the operator’s interest to make sure that they get the value from network-assisted solutions,” said Jeremy Nassau, head of wireless for Netdecisions in London, “But it could still go in several directions – you may find, for example, that GPS is good enough without the network. If the penetration gets up to reasonable levels, you’ll reach the point where you don’t need the operator anymore to provide LBSs.”

A buddy system application developed by the UK-based company iProx would even tell you if there’s a friend or relative in the vicinity that might want to dine with you and if you’ll pass an ATM on the way. A platform developed by the French company Opt[e]way would, among other things, make it possible for a taxi company to convey your location to a driver via a dashboard-mounted screen, allowing him or her to arrive at your corner within minutes, greeting you by name. A nearby department store could notify you of a sale; a bookstore could tell you that your favorite author has a new masterpiece out.

“You’ll have that 3G telephone with lovely color display and high bandwidth, and location-based services are going to enable that device to be really useful,” said Ravi Kanodia, co-founder of iProx.

These scenarios involve several distinct tasks and hardware levels, and companies are taking a multitude of approaches. In fact, most LBS companies today are concentrating on B2B applications that will only later lead to end-user services.

“There’s a variety of players trying to get a stake in the market, but the battle is well-advanced. Most of the companies already exist,” said Paris-based Thomas Gubler, who manages the 3i Group’s wireless investments in France. He added, “I don’t see any new players getting in at the platform and infrastructure level now. It’s too advanced. Still, there will be many opportunities in this field, based on technological advances.”

Network infrastructure companies like Nortel, Nokia and Ericsson provide the basic telecom equipment. UK-based Cambridge Positioning Systems – which has a cooperation agreement with Nortel and funding from 3i Group – and the American firm SignalSoft – with $2.5 million in funding – use signals received from operators to determine a user’s geographic position.

iProx’s Buddy System takes this geo-positioning data, feeding it into a “correlation engine that allows you to track up to millions of users, like an air traffic control system. We can keep track of Maria and her friends all the time, and tell her when she’s close to one of them. A little like instant messaging works on the Internet.” iProx garnered $1 million in funding through Brainspark at Tornado Insider’s Upstart Paris conference in April.
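The mechanics behind such a correlation engine can be sketched in a few lines. The following is a minimal, hypothetical illustration – not iProx’s actual system, and all names, thresholds and coordinates are invented – showing the standard scaling trick: bucket user positions into a coarse grid so proximity checks only compare users in the same or neighbouring cells, rather than every pair.

```python
import math
from collections import defaultdict

ALERT_RADIUS_M = 500  # hypothetical "nearby" threshold
CELL_DEG = 0.01       # grid cell ~1.1 km of latitude; must exceed the radius

def _cell(lat, lon):
    # Bucket a position into a coarse grid cell.
    return (math.floor(lat / CELL_DEG), math.floor(lon / CELL_DEG))

def distance_m(p, q):
    # Equirectangular approximation of distance -- fine at city scale.
    lat1, lon1 = map(math.radians, p)
    lat2, lon2 = map(math.radians, q)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6_371_000 * math.hypot(x, y)

def nearby_buddies(positions, buddies):
    """positions: {user: (lat, lon)}; buddies: {user: set of friends}.
    Returns unordered pairs of friends within ALERT_RADIUS_M of each other."""
    grid = defaultdict(list)
    for user, pos in positions.items():
        grid[_cell(*pos)].append(user)
    alerts = set()
    for user, pos in positions.items():
        ci, cj = _cell(*pos)
        # Only inspect the 3x3 block of cells around the user.
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                for other in grid[(ci + di, cj + dj)]:
                    if (other in buddies.get(user, ())
                            and distance_m(pos, positions[other]) <= ALERT_RADIUS_M):
                        alerts.add(frozenset((user, other)))
    return alerts

# Maria and Anna are ~80 m apart in central Stockholm; Erik is across town.
positions = {"maria": (59.3326, 18.0649), "anna": (59.3330, 18.0660),
             "erik": (59.40, 18.20)}
buddies = {"maria": {"anna", "erik"}, "anna": {"maria"}, "erik": {"maria"}}
print(nearby_buddies(positions, buddies))  # only the maria-anna pair alerts
```

With the grid, each update touches a handful of cells instead of scanning every user, which is what makes the "millions of users" claim plausible.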

Opt[e]way’s star product, opt[e]go TopoServer, is a middleware platform for creating end-user applications and services, aimed at telecom operators, wireless ISPs and corporate mobile intranets, says company spokesman Christophe Lefort. Released in mid-November, opt[e]go uses a patented, vector-based file format to convey location-specific information – traffic, weather, nearby gas stations, etc. The company recently raised 18 million euros from 3i Group, Morgan Stanley Dean Witter, Goldman Sachs and Part’Com.

Sweden’s MobilePosition is one of the few companies already offering early LBS services via WAP, SMS and the Web. Its yachtPosition service combines GPS and GSM data to give captains information on nearby ports and approaching weather, and allows open-ocean yacht races to be tracked via a position map on the Internet. A second service can, among other things, tell motorcyclists of nearby biker friends and locate brand-name parts stores. MobilePosition recently got $4.9 million from Kaupthing, an Icelandic investment bank, and the Swiss investment company Qino Flagship. The US investment bank KKR invested $3.5 million.

In addition to higher bandwidth, another factor that will jumpstart LBS, according to Rahim Adatia, CEO of Lokah, is the move from GSM to 2.5G networks: transmission shifts from circuit-switched to packet-switched, meaning the user pays only for data received, not connection time, and can therefore stay connected indefinitely.

This raises the question of just who profits from the new technology. So far, telecom operators insist they own the positioning data, and hence the lion’s share of the profits. “The main thing people are forgetting here is who’s going to get charged,” said Adatia, whose small British company develops B2B wireless software and is currently making the rounds to acquire VC funding. “Business models are still evolving, both in technology and business. Most importantly, the LBS company’s relationship with operators is still evolving.”

“Some LBS companies make the assumption that they will own their customers,” Adatia says, adding, “I think that will become a big problem for them. We aggregate a billing component into our server, so if the billing arrangement changes, with the operator for instance, we can easily adjust.”

Lost in all the hype over all these nifty services is the question of privacy. What if Maria does not want her telephone company to track her everywhere she goes? “Let’s face it, the police are always going to know where you are with these phones,” said Opt[e]way’s Lefort, adding, “If that also occurred on the mass market, of course that could be a problem. Profiled services will be very important to success. It will all depend on the user’s ability to disable the service. I think the problem will be solved because the service is worth it.”

Location-Based Services Find Their Niche

On Thursday, the Hong Kong mobile phone company Sunday Communications Ltd. started “Loved-Ones Radar,” allowing parents to track, to within 150 meters (500 feet), the location of their cell-phone-toting child.

Last week in Britain, the makers of London’s black taxis began an automated service that locates a cell phone caller, identifies the taxi nearest that person and then puts the caller in direct voice contact with the driver. For this you pay £1.60 ($2.60) on top of the metered fare.

And since last month, Hong Kong residents have been able to dial a number on their mobile phones to get free SARS-related data. The service determines the caller’s location and sends a text message containing addresses of buildings within 1,000 meters that have reported severe acute respiratory syndrome infections.

Three years ago, the hype about such so-called location-based services, or LBS, was overwhelming, with operators openly threatening to beep your phone with coupon offers whenever you passed a Starbucks. Today, the arrival of LBS is relatively unheralded, even as the services are filtering into the mainstream of mobile applications.

Depending on your location, your handset and the services your mobile phone company uses, you can use a cell phone to get buzzed when a friend is nearby, play hide-and-seek-like games with friends or strangers, find apartments to rent near where you are standing or get medical help, all without opening your mouth. Services that locate the nearest towing company, automated teller machine or florist have also been popular, analysts say.

“The applications that make the cut,” said Jed Kolko, lead analyst for consumer devices at Forrester Research Inc. in California, “are simple, quick hits of information that the customer needs here and now and cannot do without a mobile phone.”

Unlike cell phone services like video calling that require new, high-speed data networks still under development, most LBS offerings work on existing networks, allowing operators to provide the services without major investment. For the consumer, the cost can vary widely, from about 19 euro cents (21 U.S. cents) to €2 a message.

The simplest technology locates where your phone is – and, presumably, you – by identifying which cellular antenna is picking up and transmitting your phone’s signal. But that antenna can be as much as one kilometer (six-tenths of a mile) away.

Better software in the network essentially triangulates among antennas to determine the caller’s position to within 300 to 500 meters. Still more accurate technology uses the satellite-based Global Positioning System to locate a transmitter in the phone. But these phones cost more.
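The triangulation the article describes reduces to simple algebra once each antenna reports an estimated range to the handset. Here is a rough sketch in Python with NumPy – the tower layout and distances are invented for illustration, and real networks must cope with noisy range estimates – using the standard trick of subtracting the first range equation from the others, which turns the intersecting-circles problem into a small linear system:

```python
import numpy as np

def trilaterate(antennas, distances):
    """Estimate a handset's 2-D position from ranges to three or more
    antennas.  Each circle is (x - xi)^2 + (y - yi)^2 = di^2; subtracting
    the first circle's equation from the rest cancels the quadratic terms,
    leaving a linear system A @ [x, y] = b solved by least squares."""
    a = np.asarray(antennas, dtype=float)   # shape (n, 2)
    d = np.asarray(distances, dtype=float)  # shape (n,)
    A = 2 * (a[1:] - a[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(a[1:] ** 2 - a[0] ** 2, axis=1)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three towers a kilometre apart; the handset sits at (300, 400) metres.
towers = [(0, 0), (1000, 0), (0, 1000)]
truth = np.array([300.0, 400.0])
dists = [np.linalg.norm(truth - t) for t in towers]
print(trilaterate(towers, dists))  # recovers roughly (300, 400)
```

With perfect ranges the answer is exact; in practice, timing jitter in the measured ranges is what limits the network-based methods to the 300-to-500-meter accuracy quoted above.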

With technology not a barrier, the question in the telecom industry, then, was what services to offer. It turned out that it wasn’t going to be coupons after all.

“We launched ‘push-based’ coupon-messaging in Hong Kong shopping malls in 1997,” said Bruce Hicks, group managing director of Sunday Communications. “Its failure proved that people don’t adjust their social habits just because a technology is available. To succeed, it must be simple and complement the way that they already do things.”

“Changing social behavior takes forever,” said Swen Halling, chief executive of It’s Alive Mobile Games AB, a Stockholm-based LBS developer that makes the location-based games Supafly and BotFighters, a phone-based role-playing offering.

Many operators provide services like TeliaSonera Corp.’s Friend Finder, which allows a subscriber of the Nordic telecom to set up lists of friends and get an alert when one is nearby.

While this kind of buddy-finder service was one of the first to appear in many countries, in France the earlier priority, according to Frederic Jarjat, LBS product manager for France Telecom SA, was LBS-enabled chat – or, more specifically, “flirt,” a service whereby subscribers can exchange text messages with total strangers who are in their vicinity.

TeliaSonera also has a service with a novel twist: You’re walking down a pleasant street in Uppsala and think that it looks like a nice place to live. You send a text message on your phone to the number 4412, and you get a return message listing the nearest three available apartments – and the phone numbers to call about them.

LBS technology also leads naturally to security-based products, such as “panic buttons” for older or ill users, or a Find My Kid service like Sunday’s, which several operators are already testing.

But, in general, LBS offerings rank low on the list of what people do with mobile phones.

Operators do not publicize usage figures for any given service, but industry analysts and applications developers say LBS options make up less than 1 percent of mobile usage.

Michelle de Lussanet, telecommunications analyst at Forrester in Amsterdam, said that in terms of average revenue per user, data services as a whole make up 10 percent to 14 percent of cell phone carriers’ business. Almost 98 percent of that is person-to-person text messages.

“The remainder is mainly ring tones and operator logos,” she said. “This doesn’t leave a lot of room for other services.”

Zingo, the taxi locator that is a unit of Manganese Bronze Holdings PLC, said that of the 12,000 licensed black taxis cruising London’s streets on any given day, 500 are outfitted with its equipment. More than 6,300 people have tried the service since the company’s trial began this year.

Sunday said that in the first week of its SARS service, 16,000 of its 650,000 Hong Kong subscribers used it.

Operators and analysts say a large-scale introduction of LBS will take years; Forrester predicts that data applications like LBS will not take off until 2006 or so, when today’s teenagers become the next generation of high-tech adults.

And some services also could suffer if they are too accurate. The director of new technologies at Sunday, Henry Wong, said, “Our focus group of Hong Kong youths showed us that teens overwhelmingly don’t want the kid-finder because it doesn’t let them lie about where they are.”

Cheesy Feet & Ducks: IBM’s Voice Recognition Software

The idea of speaking into my computer and having it correctly type what I say has intrigued me since I saw the Star Trek episode “Assignment: Earth,” in which Gary Seven dictates to his IBM Selectric typewriter while plotting to sabotage a NASA launch.

The thought that I can now actually say – and have my computer type – the phrase, “The museum is open Monday to Friday from 9 am to 6 pm, Saturday from 9 am to 3 pm, Sunday from noon to 4 pm, closed major holidays,” makes me positively giddy – covering Disney World doesn’t look so daunting anymore.

It was with this light thought that I cheerfully set about installing IBM’s new SimplySpeaking Gold (remember: IBM made the Selectric! No one gets fired for buying IBM!), touted by Big Blue as the software that would change the world. My father was with me, and as I was describing what the software would do (‘yeah, that’s it… I can just talk into it and it will type what I say,’) he was shooting me looks of open dubiousness, if not mild derision.

“You’re skeptical,” I said.

“I’m not skeptical,” he said, “I know it won’t work.”

“Why,” I asked, supremely patient with my doddering dad, “would IBM offer a 30-day money-back guarantee on it if it didn’t work?”

“I don’t know,” said my father, “but it won’t work.”

Chuckling to myself (what does he know?), I set about installing SimplySpeaking Gold. Following the directions to the letter, I donned the little headset that came with the software. The training session lasted about half an hour, after which I started talking and it started typing.

Unfortunately, those two actions were entirely independent. It was as if I had installed Tourette’s Syndrome for Windows 95. I said, “Hey, look Dad, I’m talking and this thing is typing,” and it typed, “pay stark land vice talking in myths saying it is typing.” (“Typing,” I noticed later, was one word it consistently spelled correctly, along with “SimplySpeaking Gold.”) I said, “This system sucks.” It typed, “cheesy feet and ducks.” Okay, it wasn’t really that bad – I am exaggerating a little (just a little) – but it was, in fact, terrible.

I returned it the following day. Later I spoke with a software salesman, who told me that almost everyone who bought the IBM software at his shop (one of New York’s largest) brought it back.

“That’s not to say it’s bad,” he was careful to say, “it’s just that a lot of people bring it back.”


This salesman went on to tell me that a lot of the people who were disappointed with IBM really liked Dragon NaturallySpeaking, but that that software was much more difficult to learn than IBM’s. Since I thought that learning IBM’s was simply a matter of training myself to speak in the manner of one of those VCR manuals that has been translated from the original Korean via Swahili, I was game for anything.

To be fair, IBM’s ViaVoice is said (well, said by IBM) to be better than SimplySpeaking. But in an article in the San Francisco Chronicle, David Einstein reported something hauntingly similar to my experience:

“…when I said, ‘This is my first dictation,’ ViaVoice wrote, ‘This is mild irritation.’ I repeated the sentence and it came out, ‘This is missus sophistication.’”

Why, that is much better!

My next test was with Dragon’s NaturallySpeaking. With doubt in my heart, I installed the software and went through its training session. One thing that struck me immediately was that while I was reading through the training session’s text (it gives you a choice of three, I chose Dave Barry’s Adventures in Cyberspace) it was recognizing my voice right out of the box.

But I was truly astounded when, after finishing the session, I was able to write a long letter with very few mistakes: this thing actually works! Don’t believe it? Come over to my house and I’ll show you (two of my neighbours are going out to buy it after one demo).

For example, I’m writing the following five paragraphs by speaking into my computer. It’s an absolutely joyous thing: I’m sitting here with my feet on my desk speaking absolutely normally and watching it type everything I say.

And okay, there are some drawbacks (like the fact that it just wrote “arson” instead of “all are some,” and I had to go back and correct): I sit at my desk wearing this funky headset, looking for all the world like a Time-Life operator ready to take your phone call (“Good morning, my name is Nick, are you calling about our Sports Illustrated swimsuit issue?”).

But the fact is, I can dictate into this thing at about 100 words per minute after three days of use – and the folks at Dragon say that this will only improve over time.

I have noticed that in the last few days of using this software intensely it has made the same mistakes on a couple of occasions. But it also learns incredibly quickly. I only had to train “Minas Gerais” and “São Paulo” once, and never even had to tell it to recognize Rio de Janeiro. Handy, when I’m working on Brazil (it also recognized, after training, “rodoviária” and “real,” which are pronounced decidedly differently from how they’re written).

But you’ve got to have patience (it just wrote “patients”), and realize that it will take about a solid week before you begin to get close to 96% recognition.

The mistakes NaturallySpeaking made while I recited the last five paragraphs were “good morning, my name is neck”; “…with my field on my desk”; and the aforementioned “arson” and “patients.” Still, that’s not so bad. Earlier OCR scanning devices made far more mistakes, and for most of my friends who can’t type to save their lives, a couple of mistakes in each paragraph is a far happier situation than a blank page.

But NaturallySpeaking – or its presence – did cause some problems on my machine. After running it and other programs simultaneously, my computer crashed – but it turned out to be a Microsoft problem, and I had to download a small patch to fix it. You’ll also need a relatively good machine: while Dragon says you need at least a 133 MHz Pentium, 32MB of RAM and 65MB of hard drive space, I’d say that’s conservative.

Another good question is whether you can dictate into a tape recorder on the road – some smarter authors (and now I) use a tape recorder for mapping (“J&R Music World on the south side of Park Row, 200 metres south of John St”) and it would be a hoot to have the machine transcribe it. Well, short of spending upwards of $250 on a MiniDisc recorder, you’re out of luck: traditional minicassette and other analog recorders just don’t have the quality to work with NaturallySpeaking.

NaturallySpeaking comes in several editions, but the recognition engine is the same in all – the bells and whistles change as you spend more money. The basic Point & Speak model (US$59 RRP in the US) lets you do everything I did here. The Personal and Preferred editions (US$99, and US$149 to US$159, respectively) offer greater customization, and very expensive Deluxe editions are available as well. SimplySpeaking Gold sells for US$139 in the US.