AI Poker

CMU developed an AI that was allowed to learn poker, then scheduled it to play against some pros. It did OK:

The team of computer scientists behind Claudico said its matchup against four of the world’s top poker pros ended in a statistical tie.

But the human poker pros were $732,713 ahead of Claudico when they ended their 80,000 hands of no-limit hold’em against the computer. And poker aficionados are crying foul.

If this had been a real competition, one poker fan tweeted, CMU would have “one broke-a** robot” on its hands. Another said the claim of a “statistical tie” was “disingenuous.”

At Part-Time Poker, freelance writer and game designer Alex Weldon writes that the truth is somewhere in the middle:

“What’s actually going on here is the standard clash of cultures between academia and other walks of life …

“What the researchers mean when they call the results a ‘statistical tie’ is this: Assuming that Claudico was in fact equal to the human players, the results still would have come about by chance some percentage of the time. If that percentage is greater than the margin of error that the researchers set out in advance, then they can’t call the results meaningful. A standard margin of error is 5%, and I’ve confirmed with Carnegie Mellon that this is what the researchers were shooting for.

“Thus, when the researchers claim a statistical tie, what they really mean is that they can’t say with more than 95% confidence that the humans were actually better. The rest of us probably don’t need to feel quite that confident in order to be happy calling it a human win, and that’s all that this boils down to.”
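To make that concrete, here is a minimal Python sketch of the kind of significance test being described. The dollar total and hand count come from the article; the per-hand standard deviation was not published, so the values tried below are pure guesses, and the point is only how strongly the verdict depends on that assumption.

```python
import math

# Figures reported above.
TOTAL_WIN = 732_713   # humans' combined dollar lead over Claudico
HANDS = 80_000        # total hands played

def one_sided_p(total_win, hands, sd_per_hand):
    """Normal-approximation p-value: the chance of seeing a lead at least
    this large over `hands` hands if the two sides were actually even."""
    mean = total_win / hands
    std_error = sd_per_hand / math.sqrt(hands)
    z = mean / std_error
    return z, 0.5 * math.erfc(z / math.sqrt(2))

# The per-hand standard deviation is an assumption (deep-stacked no-limit
# hold'em is very high variance), so try a few plausible values.
for sd in (1_000.0, 1_500.0, 2_000.0):
    z, p = one_sided_p(TOTAL_WIN, HANDS, sd)
    verdict = "significant at 5%" if p < 0.05 else "not significant at 5%"
    print(f"sd ${sd:>5,.0f}/hand  z={z:.2f}  p={p:.3f}  -> {verdict}")
```

Depending on the variance you assume, the same $732,713 lead looks either like a clear human win or like noise, which is exactly the wiggle room Weldon is describing.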

New Scientist (7 May 2015) (paywall) notes:

Computers have a few edges over humans, says graduate student Noam Brown, part of the team behind Claudico. For example, a computer can switch randomly between various betting strategies, which may confuse human opponents.

On the other hand, Claudico is slow to pick up on and adapt to people’s playing styles – something that many pro players do with ease. “One of our big concerns is that the human will be able to identify weaknesses that Claudico has and exploit them,” says Brown.

Because Claudico taught itself to play, even the team that built it don’t quite know how it picks its moves. “We’re putting our faith in Claudico. It knows much better than we do what it’s doing.”
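Brown's point about switching randomly between betting strategies is, in game-theory terms, playing a mixed strategy: the bot commits to action frequencies rather than to a single predictable action. A hypothetical sketch of one decision point (the actions and probabilities are invented for illustration, not taken from Claudico):

```python
import random

# Hypothetical mixed strategy for a single decision point; the
# probabilities are invented for illustration, not Claudico's.
strategy = {
    "fold":  0.10,
    "call":  0.55,
    "raise": 0.35,
}

def sample_action(strategy, rng=random):
    """Draw one action according to the strategy's probabilities, so an
    opponent who knows the frequencies still can't predict any single move."""
    actions = list(strategy)
    weights = list(strategy.values())
    return rng.choices(actions, weights=weights, k=1)[0]

print(sample_action(strategy))
```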

And the official scoreboard from the Rivers Casino is here.

Each of the four pros will play 1,500 hands per day against Claudico over 13 days, with extra hands played on the last day, Thursday, May 7, to achieve a total of 80,000 hands. The pros will play using standard laptop computers. Their laptops will be linked to a computer at Carnegie Mellon University which is running the Claudico program.

Two pros will play on the casino main floor and two will play in an isolation room on the second floor. To reduce the role of luck, the pros in the isolation room will play the opposite hole cards against Claudico from the ones being played by the pros and Claudico on the main floor. The players will rotate periodically between the main floor and isolation room.
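That mirrored-hole-card setup is a variance-reduction technique (essentially “duplicate” poker): whatever card luck one human gets on the main floor, the opposite luck goes to the pairing in the isolation room, so luck largely cancels out of the combined score. A toy Monte Carlo sketch of why this helps, with invented numbers for the skill edge, card luck, and decision noise:

```python
import random
import statistics

def match_result(hands, skill_edge, luck_sd, play_sd, duplicate):
    """Simulate one paired match. Each hand's result is a small skill edge,
    a large random 'card luck' term, and smaller noise from the players'
    decisions. With duplicate deals the luck term is mirrored across the
    two tables, so it cancels in the combined total."""
    total = 0.0
    for _ in range(hands):
        luck_a = random.gauss(0, luck_sd)
        luck_b = -luck_a if duplicate else random.gauss(0, luck_sd)
        total += skill_edge + luck_a + random.gauss(0, play_sd)
        total += skill_edge + luck_b + random.gauss(0, play_sd)
    return total

# Compare the spread of final scores with and without mirrored deals.
# All numbers are invented; only the comparison matters.
for duplicate in (False, True):
    results = [match_result(1_000, 0.05, 10.0, 1.0, duplicate)
               for _ in range(200)]
    print(f"duplicate={str(duplicate):5}  "
          f"mean={statistics.mean(results):7.1f}  "
          f"sd={statistics.stdev(results):6.1f}")
```

The same small skill edge shows up in both runs, but the mirrored version has a far tighter spread around it, which is the whole point of the design.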
