Increasing generality in machine learning through procedural content generation

We recently published a paper on increasing generality in machine learning through procedural content generation:

https://www.nature.com/articles/s42256-020-0208-z

if (paywall) then (preprint):

https://arxiv.org/abs/1911.13071

Outside of games and game-like environments, what are some ways you think we can use PCG and PCG-like methods to help fight overfitting and create more general intelligence?

7 points | by togelius 1359 days ago

2 comments

  • browsergap 1359 days ago
    Looks cool!

    Meta: How is this under "Ask"? Are these paid submissions?

    Answer:

    - Generate a grammar of a natural language, then generate against that grammar plus other models to produce text/code

    - Generate proteins/RNA

    - Generate lattices/materials

    - Generate objects in pure maths: knots, planar graphs, polynomial rings, solutions to diophantine equations, error-correcting codes

    - Generate "neural net" architectures (deep learning graphs), see what they do, and see how they work as starting points for different tasks/trainings
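    The grammar idea above can be sketched as a tiny context-free generator. A minimal Python sketch, where the grammar itself is an invented toy, not one from the paper:

```python
import random

# Toy context-free grammar: nonterminals map to lists of productions.
# The rules and vocabulary here are illustrative placeholders.
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N": [["agent"], ["environment"], ["level"]],
    "V": [["explores"], ["generates"]],
}

def expand(symbol, rng):
    """Recursively expand a symbol into a list of terminal words."""
    if symbol not in GRAMMAR:
        return [symbol]  # terminal: emit as-is
    production = rng.choice(GRAMMAR[symbol])
    words = []
    for sym in production:
        words.extend(expand(sym, rng))
    return words

rng = random.Random(0)
sentence = " ".join(expand("S", rng))
print(sentence)
```

    The same loop works for code if the grammar describes a programming language, and the "other models" part would replace `rng.choice` with a learned distribution over productions.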

  • muzani 1359 days ago
    I'm not sure if this counts as "game-like", but what I'm doing is a random character/story generator: https://random-character.com/

    Many people think it's game-like, but the purpose is to stir ideas like a deck of cards to help writers (game designers, movie scripts, authors) out of writer's block.

    My next plan is to have one part done with PCG and another part done with some kind of language model. For example, in the next stage I might take a popular dramatic pattern, the Three Act Structure, consisting of [setup] [confrontation] [resolution].

    [setup] might generate 1-2 dramatic elements. [confrontation] might have 4-6. [resolution] would generate one, though excluding <deliverance/rescue> as well as some bad ends like <disaster> or <cruelty>.
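    A minimal Python sketch of that rolling logic. The element pools below are made-up placeholders, not the site's actual tables:

```python
import random

# Hypothetical dramatic-element pools, named after the <tag> style above.
SETUP = ["murderous adultery", "supplication", "fugitive", "revenge for a crime"]
CONFRONTATION = ["betrayal", "pursuit", "rivalry", "sacrifice",
                 "discovery", "abduction"]
RESOLUTION = ["revenge for a crime", "supplication", "fugitive",
              "deliverance/rescue", "disaster", "cruelty"]
EXCLUDED_ENDINGS = {"deliverance/rescue", "disaster", "cruelty"}

def roll_plot(rng):
    """Roll a three-act plot: 1-2 setup elements, 4-6 confrontation
    elements, and exactly one allowed resolution."""
    setup = rng.sample(SETUP, rng.randint(1, 2))
    confrontation = rng.sample(CONFRONTATION, rng.randint(4, 6))
    allowed = [e for e in RESOLUTION if e not in EXCLUDED_ENDINGS]
    resolution = rng.choice(allowed)
    return {"setup": setup,
            "confrontation": confrontation,
            "resolution": resolution}

plot = roll_plot(random.Random(1))
print(plot)
```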

    So let's say I randomly roll up <murderous adultery> as the intro. This consists of, say, <villainous wife>, <victim husband>, and <anti-villain murderer/lover>. In the resolution, this could spawn other plots like <revenge for a crime>, <supplication>, or <fugitive>.

    I can then generate characters based on this. Rolling on my existing generator, I get 1) <female> <villain> <damsel in distress> <blatant liar>, 2) <male> <anti-villain> <adorable> <right cause> <wrong side effect>, and 3) <male> <support> <big> <stoic> <loyal>.
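    The character rolling can be sketched the same way. A minimal Python version, with hypothetical tag pools mirroring the examples above:

```python
import random

# Invented tag pools (placeholders, not the generator's real data).
SEX = ["male", "female"]
ROLE = ["villain", "anti-villain", "support", "hero"]
TRAITS = ["damsel in distress", "blatant liar", "adorable", "right cause",
          "wrong side effect", "big", "stoic", "loyal"]

def roll_character(rng, n_traits=3):
    """Roll one character as a flat tag list: sex, role, then traits."""
    tags = [rng.choice(SEX), rng.choice(ROLE)]
    tags += rng.sample(TRAITS, n_traits)
    return tags

rng = random.Random(7)
cast = [roll_character(rng) for _ in range(3)]
for character in cast:
    print(" ".join(f"<{tag}>" for tag in character))
```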

    All procedurally generated. Now we have a pile of tags. There are some hard-coded descriptions in the current prototype, and while they work, 80% of the development time, and most of the output quality, has gone into the writing and display of these tags. What I'd like to do is feed these tags into some natural language model to generate a description, and maybe even string together a plot in a way a human wouldn't.
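    The handoff to a language model is basically prompt construction from the tag pile. A minimal sketch, where the template is a placeholder rather than the prototype's actual wording:

```python
def tags_to_prompt(tags):
    """Turn a flat tag list (sex, role, traits...) into a natural-language
    prompt for a text model. Template is a hypothetical example."""
    sex, role, *traits = tags
    trait_text = ", ".join(traits)
    return (f"Write a one-paragraph character description of a {sex} "
            f"{role} who is {trait_text}.")

prompt = tags_to_prompt(["female", "villain", "damsel in distress",
                         "blatant liar"])
print(prompt)
```

    The returned string would then go to whatever model is available (GPT-3 or similar), which is the only non-PCG step in the pipeline.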

    tldr: I use PCG and popular patterns to generate plot curves and characters, then feed them into something like GPT-3 to illustrate them.