
Tech Seeds That Federal Money Can Plant

LUIS VON AHN, a computer scientist at Carnegie Mellon University, sold one Internet start-up to Google in 2009, and is now on to another. With the new company, Duolingo, he hopes to tap the millions of people learning languages online to create a crowdsourced engine of translation. “We want to translate the whole Web into every major language,” Mr. von Ahn says.

Ambitious, sure, but Duolingo recently attracted $15 million of venture capital. The investors are betting on Mr. von Ahn, his idea and his growing team of 18 engineers, language experts and Web designers.

Mr. von Ahn, 33, personifies some of the essential ingredients of America's innovation culture, when it works well. An immigrant from Guatemala, he has intelligence and entrepreneurial energy to spare. And he has received a helping hand from the federal government. Duolingo began as a university research project financed by the National Science Foundation.

That pattern has been repeated countless times over the years. Government support plays a vital role in incubating new ideas that are harvested by the private sector, sometimes many years later, creating companies and jobs. A report published this year by the National Research Council, a government advisory group, looked at eight computing technologies, including digital communications, databases, computer architectures and artificial intelligence, tracing government-financed research to commercialization. It calculated the portion of revenue at 30 well-known corporations that could be traced back to the seed research backed by government agencies. The total was nearly $500 billion a year.

“If you take any major information technology company today, from Google to Intel to Qualcomm to Apple to Microsoft and beyond, you can trace the core technologies to the rich synergy between federally funded universities and industry research and development,” says Peter Lee, a corporate vice president of Microsoft Research. Dr. Lee headed the National Research Council committee that produced the report, titled “Continuing Innovation in Information Technology.”

The long-term importance of government-supported research may be easy to overlook in the current debate over how to reduce the federal deficit. But it is an economic issue worth keeping in mind, and one that points to the kinds of tough choices and trade-offs facing policy makers.

The Budget Control Act, which is scheduled to take effect next January unless Congress shifts course, calls for across-the-board cuts in discretionary spending - for programs other than entitlements like Medicare and Social Security. A new study by the American Association for the Advancement of Science estimates that federal spending on research and development would be trimmed by more than $12 billion in 2013. The National Science Foundation, which finances most government-supported computer science research at universities, would have its budget cut by more than $450 million.

Last week, Senate leaders were trying to negotiate a deficit-reduction deal that would avoid the automatic cuts, but programs like those that finance research are likely to receive rigorous scrutiny from budget-cutters for years into the future. Already, the science association's report says, government spending on research and development has declined by 10 percent since 2010, when adjusted for inflation.

YET why should government support for scientific research and technology development be spared from the belt-tightening? Unless society benefits inordinately from such spending, there is no case for special treatment. In a new book, “Innovation Economics: The Race for Global Advantage” (Yale University Press), Robert D. Atkinson and Stephen J. Ezell forcefully present the argument for the exceptional role that science and technology play in the economy.

Cutting funding for research and development, Mr. Atkinson said in an interview, is “completely shortsighted.” Spending on science and technology, he said, is an investment that produces a larger economy in the future - generating wealth, jobs and tax revenue. Besides, he said, the bill is not very high in the United States. America ranks 22nd among 30 nations in university and research and development funding as a share of gross domestic product.

In their book, Mr. Atkinson, president of the Information Technology and Innovation Foundation, a nonprofit policy research group, and Mr. Ezell, a senior analyst at the foundation, define innovation as not only the generation of new ideas but also as their adoption in new products, processes, services and organizational models. In their view, the goal of policy should be to invest in and nurture the development of the innovation pipeline, from basic science to commercialization.

That would call for a more hands-on role for government than is embraced by the mainstream of economic thought, certainly in the United States. The consensus of most economists is that basic science is a “public good,” with the benefits widely shared by society, and thus a worthwhile recipient of government financing. But technology - the application of science to real-world problems - is regarded as a “private good,” with its development best left to the marketplace.

Mr. Atkinson, whose Ph.D. is in city and regional planning, says he is presenting the case for loosening the grip of “neoclassical economists” on policy. He has held economic and technology policy jobs in Rhode Island and for Congress, and has served in advisory groups in the administrations of Presidents Barack Obama, Bill Clinton and George W. Bush. Mr. Atkinson's nonprofit policy research organization receives financial support from groups including the Alfred P. Sloan Foundation and the Ewing Marion Kauffman Foundation, and from corporations including Intel and I.B.M.

A linchpin of innovation policy, according to Mr. Atkinson, is collaboration between government and industry. As a prime example, he points to Germany and its network of 60 Fraunhofer Institutes, financed 70 percent by business and 30 percent by federal and state government. The institutes, he says, perform applied research intended to translate promising technologies, from polymer research to nanotechnology, into products. These tech-transfer clusters, Mr. Atkinson says, are an important reason for the strength of Germany's manufacturing sector, even though wages for its factory workers are 40 percent higher than those for American workers.

Taking a page from the German model, the Obama administration announced this year plans for up to 15 manufacturing innovation institutes, public-private collaborations called the National Network of Manufacturing Innovation. The first will be in Youngstown, Ohio, specializing in custom manufacturing using 3-D printing technology. Mr. Atkinson says that while this is a good step, what is needed is a long-term commitment. He noted that Germany's Fraunhofer initiative began nearly four decades ago, and has grown steadily.

Mr. Atkinson's focus on research investment, and on the pathways that bring ideas into the marketplace, is gaining increasing attention in economics, according to David B. Audretsch, an economist at Indiana University. “Research and development unequivocally pays off economically,” Mr. Audretsch says. “But the biggest payoff is in ecosystems that take the innovative inputs and make them commercial outputs - products.”

MR. VON AHN'S start-up is a new addition to the growing technology cluster around Pittsburgh, with Carnegie Mellon, a major research university, as its hub. Duolingo combines crowdsourcing and computing, language learning and online translation. People learn a language at no cost, while their lessons feed into Duolingo's fast-expanding translation database, which is sorted and scored for accuracy with smart software.

The crowdsourcing principle is similar to that used by Mr. von Ahn's previous company, reCaptcha. Its service employs the widely used Web security feature in which users identify and type words, scanned from old books and newspapers, to gain access to a site or service - in the process helping to accurately digitize those old texts.

But Duolingo is far more ambitious and complex. The exploratory research was supported by a five-year grant from the National Science Foundation, about $120,000 a year, used mostly to pay the tuition and living expenses for a graduate student. Two years later, Mr. von Ahn and his small team had made enough progress to have a working prototype, start the company and attract private investors.

“The grants are really helpful when you're working on something that is scientifically complex,” Mr. von Ahn says, “when you're in the early year, or two or more, when you have no idea if it will work.”

Mr. von Ahn, joining other scientists, has made a few trips to Washington to speak to members of Congress and their staffs. His message is this: “Don't eat your seed corn. It may seem like an easy thing to cut now. But years later, you will regret that you did not invest a tiny portion of your budget in research, in the future.”



Gaming Faces Its Archenemy: Financial Reality

An image from the downloadable PlayStation game Journey. (Sony Computer Entertainment)

NOT long ago the creators of video games were declaring their medium the art form of the 21st century. Games could aspire to the drama and spectacle of movies but would captivate society with their irresistible interactivity.

More than 200 million Wii, Xbox 360 and PlayStation 3 systems were sold worldwide. Sales of portable gaming machines surged as well. Upward of 12 million subscribers were paying $15 a month to play the online game World of Warcraft, and competitors were plotting to develop worthy rivals. The motion-sensing Kinect system from Microsoft generated considerable buzz, with its promise of freeing players from having to push buttons and wave wands.

And yet the gaming world has found itself teetering at the edge of a financial cliff. In the first eight months of this year retail sales of video games plummeted 20 percent in the United States. That followed a lackluster performance in 2011, when sales fell 8 percent. An analysis on the Web site Gamasutra this year said it was possible that 2012 would be the worst year for retail video game software and hardware sales since 2005.

The struggling economy has certainly been a factor in the decline, especially considering that young men - long a core audience for games - were hit so hard during the recession. Another development will sound familiar to anyone who once had a groovy record collection: the democratizing, disrupting effect of less expensive digital downloads has changed the business model. Nearly everywhere, it seems, people have been sharing Words With Friends, slinging Angry Birds at pigs or springing their creatures through a precarious Doodle universe. All those games, made for smartphones, sure are popular, and the financial picture improves when their sales are included, but they can be had for pennies and seemingly become disposable almost as fast as they are released.

The video-game industry barely survived the brutal crash of the early 1980s: 29 years ago this fall Atari buried millions of unsold video games - believed to be mostly copies of Pac-Man and E.T. the Extra-Terrestrial - in a New Mexico landfill. Are video games facing another devastating crash? Have developers been putting out inferior work, or is something beyond their control going on? What should they do to adapt? The company credited with saving the industry last time was Nintendo, which finally plans to introduce its new Wii U, the successor to the 2006 Wii, next month. Can Nintendo lead the way again?

To try to get a handle on some of these issues, two video-game critics - Chris Suellentrop, deputy editor of Yahoo News, and Stephen Totilo, editor in chief of the gaming site Kotaku.com - recently discussed the challenges facing the industry.

STEPHEN TOTILO This has been a year of underachievement for many of gaming's top achievers. How very 2012 it was for a game like Draw Something to capture the world's attention in February; attract about 14 million players a day in April; seduce the FarmVille company Zynga into buying the game's maker, Omgpop, for $180 million; and by the end of the month see its daily player base fall to 10 million. How very 2012 it was for the vaunted hit-maker Blizzard to release a game, Diablo III, that was 11 years in the making and then have to repeatedly apologize for its shortcomings. The Kinect might be selling Xboxes, but it isn't helping sell that many games, because there are hardly any Kinect games that anyone talks about and very few that sell. It's just a watered-down repeat of the Wii phenomenon.

This has been the year of sinking game company stocks, stagnating console sales, creative miscues from some of the medium's best creators and a lack of many blockbuster games - from big companies. Note those last three words. It has been a very bad year for corporate video games. You know, gaming's elite.

CHRIS SUELLENTROP Yes, it's been a bad year for games that require the purchase of a physical disc with cover art and liner notes - I mean, an instruction booklet - an oddly retro aspect of the medium. And to take the baton you're offering, yes, 2012 has been a remarkable year for downloadable titles, many of them created by independent developers working outside the traditional studio system. I wouldn't call three of the year's best games - the downloadable Journey, Fez and Papo & Yo - representative of gaming's peasant class. Still, I don't envision the next title from thatgamecompany, the developer behind the artful, downloadable PlayStation games Flower and Journey, making up for the industry's 30 percent revenue decline.

Besides, do you really think that the quality of individual titles is the cause of this collapse? The nation is facing nothing less than a fiction crisis. Four of the five best-selling books last year on Amazon were works of nonfiction, and the fiction title, “Mill River Recluse,” was a Kindle download. The theatrical box office recently saw its worst weekend in 10 years. Narrative television - the quality of shows like “Breaking Bad” and “Mad Men” notwithstanding - is in decline. The most-watched shows are sports and reality spectacles. Anyone who has engaged in the make-believe required for most video games to work their magic knows that games are fiction too. Why would games be immune?

TOTILO Because video games aren't all narrative fiction. Apologies to fans of the interactive storytelling pioneers of BioWare, the studio behind Mass Effect, and to those still searching for Bowser's motivation for repeatedly kidnapping Princess Peach, but few people play video games for the story. Or for the acting. Or for many of the other cinematic aspects that can't mask a bad game.

On the subway I ride daily the only video-game-related decline is the tilting down of heads so people can see the narrative-free games on their cellphones. These people could, of course, be reading books or watching movies. Many of them are not. They have an appetite for the interactivity of a game. They want to poke at a system and have it, or an opposing gamer, respond. They want to play.