When did AI begin to have an impact on the video game business? If you ask Kevin Gammill, partner general manager for PlayFab at Microsoft, he’ll reach back four decades and bring up early computer-controlled opponents such as the flying saucers in Atari’s Asteroids arcade game. “I think AI has been around as long as gaming’s been around,” he says.
Tamir Melamed and Kevin Gammill [Photo: Harry McCracken]
In 2019, AI’s potential applications in gaming go far beyond making decisions for bad guys. Microsoft–the rare company with deep investments in both games and basic computer-science research–is well positioned to explore them.
That includes useful stuff that makes life better for game players without screaming “Hey, artificial intelligence!” Studies, for instance, have shown that online competition greatly benefits from players being matched with others of roughly comparable skill. “If you go into a game and you just get slaughtered, it’s probably not a good experience,” explains Gammill. “If everyone’s too easy, it’s also probably not a good experience.” Xbox Live has long used an algorithm called TrueSkill (recently updated as TrueSkill 2) to help ensure that contestants are neither bored nor massacred by their opponents.
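Microsoft has published how TrueSkill scores a potential pairing: each player carries an estimated skill (a mean and an uncertainty), and match quality peaks when the two distributions overlap. As a rough sketch–not Microsoft’s production code, and using TrueSkill’s default performance-variance constant–the two-player version looks like this:

```python
import math

BETA = 25 / 6  # TrueSkill's default per-game performance variability


def match_quality(mu_a, sigma_a, mu_b, sigma_b, beta=BETA):
    """Quality of a 1v1 pairing, near 1.0 when skills are evenly matched.

    mu = estimated skill, sigma = uncertainty in that estimate.
    """
    denom = 2 * beta**2 + sigma_a**2 + sigma_b**2
    return math.sqrt(2 * beta**2 / denom) * math.exp(
        -((mu_a - mu_b) ** 2) / (2 * denom)
    )
```

Two players with identical skill estimates score close to 1.0; a large skill gap drives the quality toward zero, which is exactly the “slaughtered” scenario the matchmaker tries to avoid.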
Another piece of practical AI was inspired by the fact that Microsoft “heard for years loud and clear from gamers that they would greatly prefer to spend a lot more time playing games than downloading games,” says Ashley McKissick, who manages the Game Pass service. The company initially tried to let players skip ahead to the action before a download was complete through a system that required some heavy lifting on the part of game publishers and was therefore not universally adopted.
Starting last summer, Microsoft replaced this unsatisfactory handwork by humans with a piece of AI-enhanced technology called FastStart. It leverages machine learning to determine which bits of a game to download first, allowing gamers to begin playing up to twice as fast. “We’re not really changing the laws of physics here, but it does make your download much smarter,” says McKissick.
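Microsoft hasn’t published FastStart’s model, but the scheduling idea it feeds is simple: rank a game’s download chunks by when the player is predicted to first need them, and fetch the soonest-needed chunks first. A minimal sketch, with the learned model stubbed out as a hypothetical dictionary of predictions:

```python
def faststart_order(chunks, predicted_first_use):
    """Order download chunks so predicted-soonest-needed assets arrive first.

    `predicted_first_use` maps chunk id -> predicted seconds until first use,
    a stand-in for FastStart's machine-learned model (which is not public).
    Chunks the model has no prediction for are downloaded last.
    """
    return sorted(chunks, key=lambda c: predicted_first_use.get(c, float("inf")))
```

With predictions like `{"menu": 0, "tutorial": 30}`, the menu and tutorial assets jump ahead of late-game textures, so play can begin long before the full download finishes.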
Increasingly, Microsoft is formalizing the kind of collaboration that helps AI make its way into games. Similar to the MSR/Office meeting called Roc, a confab called Magneto is designed to cultivate conversation–and outright hacking–between MSR and the gaming group. Along with those two constituencies, “there are people from Bing there, there are people from Windows there, there are people from Azure there,” says Tamir Melamed, Microsoft’s head of PlayFab engineering. “Because there’s a lot of those technologies that we think we can share down the road.”
One joint project emerged from Microsoft’s annual companywide hackathon. In 2017, the gaming group was wrestling with the challenge of curating Mixer, a game-streaming service–in the same zip code as Twitch, but more interactive–which Microsoft had acquired in the form of a startup called Beam. “We found ourselves with a much larger volume of streams than we had anticipated,” says Chad Gibson, Mixer’s general manager. “And so we were trying to find, ‘How can we provide new, unique ways of allowing players of PlayerUnknown’s Battlegrounds or Fortnite to be discovered?'”
Chad Gibson and Ashley McKissick [Photo: Harry McCracken]
At around the same time that the Mixer team was asking itself that question, the hackathon was being won by some Microsoft Research staffers who’d devised “Watch For,” an AI system for analyzing live video streams and identifying specific events therein. (Microsoft was so impressed by the technology’s commercial possibilities that it announced the team’s victory without disclosing exactly what it had created.) The two groups collaborated to use Watch For as the basis for HypeZone, a Mixer feature that lets viewers tune in to the most climactic moments in game streams in progress. “It allowed us to do new forms of discovery that we honestly didn’t think were possible,” says Gibson.
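Watch For’s video-analysis internals haven’t been disclosed, but HypeZone’s selection step can be illustrated with a toy proxy: in a battle-royale game, the stream with the fewest players left alive is the one closest to its climax. A sketch under that assumption (the `alive` field is hypothetical):

```python
def hypezone_pick(streams):
    """Pick the live stream nearest its climactic moment.

    Uses players-left-alive as a crude stand-in for Watch For's actual
    video-stream event detection, which Microsoft has not made public.
    """
    contenders = [s for s in streams if s["alive"] > 1]  # one left = match over
    if not contenders:
        return None
    return min(contenders, key=lambda s: s["alive"])
```

In practice the signal comes from the video itself–Watch For recognizes on-screen events–but the channel-surfing logic layered on top is this simple: always cut to whichever stream is about to pay off.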
As long as gaming has its frustrations, AI should provide further ways to mitigate them. Recently, Gammill was engaged in heated competition in the Tom Clancy first-person shooter Rainbow Six Siege against three friends. Then one contestant’s internet connection choked. “Three of us are running around and a frozen character is standing there,” he says. And a frozen character can’t do much except be mowed down.
A better scenario would be if the game could use AI to determine that a player had gotten cut off, and then take temporary control of the corresponding character–and play in the same style as that person. “Now we’re very close to scenarios like that actually coming to fruition,” says Gammill.
THE SILICON FACTOR
Steve Jobs was fond of saying that Apple was the only computer company that built “the whole widget”–not just software or hardware, but both, integrated so well that the seams of the experience start to fade away. In recent years, that philosophy has reached its ultimate expression as Apple has even designed its own iPhone and iPad processors and optimized them for running Apple software.

The same vertical integration that’s a boon for a smartphone or tablet makes sense, on a grander scale, for a data center–such as the ones that power Microsoft’s Azure services. Enter Project Brainwave. That’s the name for the custom hardware accelerator Microsoft has designed–using Intel field-programmable gate arrays (FPGAs)–specifically for the purpose of speeding up AI running in the Azure cloud.
Microsoft’s move into designing its own hardware for optimal AI is hardly unique. Both Google and Amazon are also moving down the stack from software to silicon for similar reasons. But Microsoft isn’t just hopping aboard a trendy bandwagon. Project Brainwave is the end product of an opportunity Doug Burger began thinking about almost a decade ago–and at first, he did it on his own. “I started the work in 2010 and then exposed it to management after about a year,” remembers Burger, who was a researcher within MSR at the time.
Project Brainwave sprang from Microsoft’s realization that embracing AI needed to start at the chip level [Photo: courtesy of Microsoft]
Conventional chips know how to execute the computing instructions in their repertoire when they leave the factory, and can never be retrained for a different purpose–such as efficiently running a new machine-learning algorithm. FPGAs, by contrast, are like chameleons, says Burger. “What the FPGAs allow us to do is build stuff really fast and get it into production, and then iterate on a very rapid cadence,” he explains. “So that chameleon is changing colors really fast and getting better every time it changes color.”
FPGA technology allows Microsoft to deliver highly efficient deep learning as a service in a way that addresses specific customer requests. “A lot of the problems that they want to solve are around image analysis,” says Ted Way, senior program manager for Azure Machine Learning. “‘I want to look at my manufacturing defects.’ ‘I want to look at whether [products are] out of stock.’ ‘I want to see if people are smoking at my gas station because I’m afraid of fires.’ Doug’s team was able to turn that around and build these convolutional neural networks that ran super fast on the FPGA in just six months or so.” By silicon standards, that’s quick.
When Burger had begun his personal investigation of FPGAs in 2010, it wasn’t clear–at least to people who aren’t prescient computer scientists–how quickly AI would go mainstream, let alone that delivering it as a service would become a strategic imperative for a company such as Microsoft. Soon enough, Microsoft understood the value his brainchild could bring to Azure. Last July, after Project Brainwave left the lab, so did Burger and his team. Today, they’re continuing their work as part of the Azure group rather than MSR.
Such a segue is not unusual. “One thing about the Microsoft culture now, that boundary between research and product has blurred quite a bit,” says Burger. “The product groups have lots of people who were formerly researchers and are developing new stuff. Research has not only people that do research but engineers building stuff. It’s more of a continuum.” Nadella, he adds, “has done a great job of pushing down this kind of innovation.”
SELF-SERVE SMARTS
With Azure, Microsoft is in a race with Amazon and Google to provide AI and other advanced computing functions to businesses of all sorts as on-demand services. That’s not just good for outside companies; there are also groups within Microsoft that can benefit from pre-packaged AI and machine learning.
Case in point: Codie, a multilingual chatbot designed to provide information about coding. An internal Microsoft experiment for now rather than a commercial product, it sprang from the realization that one major obstacle for would-be software engineers is simply having access to information about matters such as commands in the Python programming language and syntax for SQL database queries. The problem is especially acute for non-native English speakers.
Matt Fisher, senior data analytics manager for Office 365 and Microsoft 365, and one of Codie’s creators, describes the service as “Cortana’s geekier little sibling.” It emerged from the Microsoft Garage, a program that gives employees encouragement and resources as they pursue ideas they’re passionate about, whether or not they fit neatly into official responsibilities. Fifteen staffers with diverse backgrounds were on the team that created the service, including developers, designers, and marketers. It beat 767 other projects to win the company’s Redmond Science Fair, and took second place out of 5,875 entries in the company’s inclusivity challenge.
Afreen Rahman and Matt Fisher [Photo: Harry McCracken]
Using text-based input, Codie answers coding questions by drawing information from Microsoft’s Bing search engine and user-to-user tech advice site Stack Overflow. “In 48 hours we had something that was working across five different spoken languages and pulling from a huge database of information so you could ask it a coding question in Spanish and get a technical answer back in Spanish,” says Afreen Rahman, who works on the Microsoft Store in her day job as a software engineer.
Though Codie’s creators brought a variety of skills to the enterprise, none of them started out knowing that much about AI. “We used out-of-the-box tools available as part of the AI Suite that Microsoft offers,” says Rahman. “And as devs we were able to pick up the documentation in no time and just get going.”
Fisher rattles off the Microsoft cloud offerings that power Codie: “We used everything from the Azure learning service to LUIS language understanding. QnA Maker, the Bing graph, the Microsoft graph, the Azure bot framework, the Azure speech plugin.” There was plenty of Microsoft AI expertise in there; it’s just that it was in ready-to-use form. For Codie–and many other things people want to build–that’s enough.
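The flow Rahman and Fisher describe–take a question in any language, consult English-language sources, answer in the asker’s language–can be sketched as a simple pipeline. Everything here is a stand-in: the four callables represent the Azure language-detection, translation, and retrieval services named above, not their actual APIs:

```python
def answer_coding_question(question, detect_language, translate, lookup):
    """Hypothetical sketch of a Codie-style multilingual Q&A pipeline.

    detect_language(text) -> language code such as "en" or "es"
    translate(text, src, dst) -> translated text
    lookup(english_question) -> answer drawn from sources like Bing
    and Stack Overflow
    """
    lang = detect_language(question)
    english = question if lang == "en" else translate(question, lang, "en")
    answer = lookup(english)
    return answer if lang == "en" else translate(answer, "en", lang)
```

The design point is that none of the stages required the team to build AI from scratch: detection, translation, and retrieval are all consumed as ready-made services, and the team’s own code is mostly orchestration like the above.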
As an effort to leverage AI as an enabling technology for an inspiring purpose, Codie is already a success. The people who built it are thinking about upgrades–one obvious one would be to let users talk rather than type–and how to make it broadly available. “Our goal is that we would like to see it be used outside of Microsoft’s walls,” says Fisher. “We’re working towards what we need to do to get there. We have support from this lovely group, the Garage, but this is our second or third job in many cases.”

REAL PROBLEMS, REAL RESEARCH
One other thing about Microsoft’s new approach to cross-pollinating research and products: It isn’t just the products that benefit. AI has an insatiable appetite for the sort of data required to train machine-learning algorithms. Microsoft, as one of the largest tech companies in the world, has that data, in anonymized form, by the metric ton. Which means that if there was a time when its research efforts benefited from being walled off from money-making businesses that serve actual human beings, it’s over.
“Nowadays, to do a lot of very exciting AI research, you need to get access to real problems and you need to get access to data,” says Shum. “This is where you work together with [product teams]. You build a new model, you train the new model, and then you tweak your new model. Now you have advanced your basic research further. And along the way, you never know–you could get a breakthrough.”