Poker-AI.org

Poker AI and Botting Discussion Forum
PostPosted: Wed Jun 26, 2019 4:18 am 
Junior Member

Joined: Wed Jun 26, 2019 4:02 am
Posts: 28
I'm trying to create a solver that's like piosolver/simplepostflop/monkersolver/gto+/gtorb. Currently I'm building a game tree with the following specifications (sketched in code below):

* two players, each with 100% range
* flop is specified
* heavy action abstraction:
  * two bet sizes allowed on each street
  * one raise size allowed on each street
* no card abstraction
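
Concretely, the spec might be written down something like this (the bet/raise sizes here are placeholders, not the ones I'm actually using):

Code:
# Illustrative spec for the tree above; sizes are placeholders.
from dataclasses import dataclass, field

@dataclass
class StreetAbstraction:
    bet_sizes: list      # allowed bet sizes, as fractions of the pot
    raise_size: float    # the single allowed raise size, as a fraction of the pot

@dataclass
class TreeSpec:
    board: list                       # the specified flop, e.g. ["As", "Kd", "7c"]
    ranges: tuple = ("100%", "100%")  # both players play their full range
    streets: dict = field(default_factory=lambda: {
        street: StreetAbstraction(bet_sizes=[0.5, 1.0], raise_size=1.0)
        for street in ("flop", "turn", "river")
    })

spec = TreeSpec(board=["As", "Kd", "7c"])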

I have chance nodes that deal turn and river cards, but I don't have chance nodes for dealing hole cards. I figure it's simpler if each decision node just has a dictionary that maps hole card combinations to regret sums and strategy sums.
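
Roughly what I have in mind for a node, just as a sketch (names are made up; assumes numpy float32, i.e. 4 bytes per stored value, which is where the 8-bytes-per-entry figure below comes from):

Code:
# Sketch of a decision node keyed by hole-card combo. Names are illustrative.
import itertools
import numpy as np

DECK = [r + s for r in "23456789TJQKA" for s in "cdhs"]   # 52 cards
ALL_COMBOS = list(itertools.combinations(DECK, 2))        # 1326 hole-card combos


class DecisionNode:
    def __init__(self, num_actions, combos=ALL_COMBOS):
        self.num_actions = num_actions
        # One float32 regret-sum and strategy-sum vector per hole-card combo.
        self.regret_sum = {c: np.zeros(num_actions, dtype=np.float32) for c in combos}
        self.strategy_sum = {c: np.zeros(num_actions, dtype=np.float32) for c in combos}

    def current_strategy(self, combo):
        """Regret matching: normalize positive regrets, else play uniformly."""
        pos = np.maximum(self.regret_sum[combo], 0.0)
        total = pos.sum()
        if total > 0:
            return pos / total
        return np.full(self.num_actions, 1.0 / self.num_actions, dtype=np.float32)

In practice a flat (1326, num_actions) float32 array indexed by a combo index would be much tighter than a dict of tiny arrays, but the dict spells out the mapping.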

Anyways, this is the game tree I'm getting:
21 flop decision nodes w/ an average of 2.80 edges per node
9,310 turn decision nodes w/ an average of 2.66 edges per node
2,417,807 river decision nodes w/ an average of 2.54 edges per node

The game tree fits in memory if I don't include regret sums and strategy sums. The issue is that I need to store this many bytes for regret sums and strategy sums:

2,417,807 river decision nodes * 2.54 actions per node * 1326 hole card combinations * 8 bytes (4 bytes for regret sum and 4 bytes for strategy sum) = 65 GB.

Even if I could get away with 2 bytes each for the regret sum and the strategy sum, I'm still looking at 33 GB.

Even if I use CFR+ and don't store the average strategy (with CFR+ the current strategy computed from the regrets converges to a Nash equilibrium in practice), I'm still looking at 16 GB.
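
For reference, the arithmetic behind those three figures (back-of-the-envelope only, river nodes only, GB = 10^9 bytes):

Code:
# Back-of-the-envelope check of the three memory figures above.
river_nodes = 2_417_807
avg_actions = 2.54
combos = 1326            # hole-card combinations

def total_gb(bytes_per_cell):
    # bytes stored per (node, action, hole-card combo) cell
    return river_nodes * avg_actions * combos * bytes_per_cell / 1e9

print(total_gb(8))   # 4-byte regret sum + 4-byte strategy sum  -> ~65 GB
print(total_gb(4))   # 2 bytes each                             -> ~33 GB
print(total_gb(2))   # 2-byte regret sum only (CFR+, no avg)    -> ~16 GB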

The same tree in piosolver only takes up 12.5 GB. Granted, piosolver doesn't use CFR, but I've tried the other programs and they're all using significantly less memory.

One option is to store the regret sums and strategy sums on disk and compress/decompress them on the fly during iterations, but I don't think any of the other solvers do this: when I tried them, they weren't reading from or writing to disk according to Windows Task Manager.

Any ideas for fitting regret sums and strategy sums in memory?


PostPosted: Wed Jun 26, 2019 9:42 am 
Site Admin

Joined: Sun Feb 24, 2013 9:39 pm
Posts: 642
On the river there are only about 75 unique hand ranks on average. Maybe you could use this observation to implement some sort of lossless hand abstraction.

The CFR+ papers discuss various opportunities for compression. Could they be employed on the fly and entirely in memory?


PostPosted: Wed Jun 26, 2019 2:07 pm 
Junior Member

Joined: Wed Jun 26, 2019 4:02 am
Posts: 28
spears wrote:
On the river there are only about 75 unique hand ranks on average. Maybe you could use this observation to implement some sort of lossless hand abstraction.

The CFR+ papers discuss various opportunities for compression. Could they be employed on the fly and entirely in memory?


Could you explain what you mean by the 75 unique hand ranks? Can your 1326ish hands on the river usually be bucketed into 75 buckets where each bucket has the same equity or something? The only issue I see with that is it wouldn't take into account card removal effects.

I think that because CFR+ resets negative cumulative regrets to 0, compression becomes easier: you can compress something like 0, 0, 0, 0, 0, 0, 0, 56, 54 into 7, 0, 1, 56, 1, 54 (run length followed by value), though I'm sure the real scheme is more sophisticated. I was thinking of using a form of CFR+, but I'm wondering how much compression/decompression will slow things down.
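
Something like this toy version is what I had in mind, though the scheme in the papers is presumably smarter:

Code:
# Toy run-length encoding of the kind described above: encode a vector as
# (run_length, value) pairs, so the long runs of zeros left by CFR+ resets collapse.
def rle_encode(values):
    out = []
    i = 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        out.extend((j - i, values[i]))   # run length, then value
        i = j
    return out

def rle_decode(pairs):
    out = []
    for k in range(0, len(pairs), 2):
        out.extend([pairs[k + 1]] * pairs[k])
    return out

assert rle_encode([0, 0, 0, 0, 0, 0, 0, 56, 54]) == [7, 0, 1, 56, 1, 54]
assert rle_decode([7, 0, 1, 56, 1, 54]) == [0, 0, 0, 0, 0, 0, 0, 56, 54]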


PostPosted: Wed Jun 26, 2019 2:41 pm 
Site Admin

Joined: Sun Feb 24, 2013 9:39 pm
Posts: 642
Quote:
Could you explain what you mean by the 75 unique hand ranks? Can your 1326ish hands on the river usually be bucketed into 75 buckets where each bucket has the same equity or something?


Yes. A 7-card hand can be given a number, its hand rank, between 1 and 7462 representing the strength of the hand, with 7462 being a royal flush. But for a given 5-card board there are only about 75 distinct hand ranks on average, and a maximum of about 140 as far as I know. On nut boards there is only 1 distinct hand rank.
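
As a rough sketch of what I mean (evaluate7 below is just a stand-in for whatever 7-card evaluator you already have, returning a rank in 1..7462; it's not a real library call):

Code:
# Group the live hole-card combos on a fixed board by their 7-card hand rank.
import itertools
from collections import defaultdict

DECK = [r + s for r in "23456789TJQKA" for s in "cdhs"]

def rank_buckets(board, evaluate7):
    """Map hand rank -> hole-card combos that make that rank on this board."""
    live = [c for c in DECK if c not in board]
    buckets = defaultdict(list)
    for combo in itertools.combinations(live, 2):
        buckets[evaluate7(list(combo) + list(board))].append(combo)
    return buckets

# len(rank_buckets(...)) is the point: usually around 75 buckets, versus the
# 1081 hole-card combos that don't conflict with a 5-card board.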

Quote:
The only issue I see with that is it wouldn't take into account card removal effects.


Well spotted. You might be able to ignore card removal effects when the distributions are not too uneven. When the distributions are very uneven, there might be hands with effectively zero probability that could be pruned out. There might be ways to deal with card removal effects - I dunno, I haven't studied it for long. Card removal effects are expensive to model properly anyway.

Quote:
I think that because CFR+ resets negative cumulative regrets to 0, compression becomes easier: you can compress something like 0, 0, 0, 0, 0, 0, 0, 56, 54 into 7, 0, 1, 56, 1, 54 (run length followed by value), though I'm sure the real scheme is more sophisticated. I was thinking of using a form of CFR+, but I'm wondering how much compression/decompression will slow things down.


All agreed. There is often a trade-off between speed and memory. The paper talked about sorting boards too for more compression opportunities IIRC.

