This is a little script that uses Python to generate the cellular automata that Wolfram discusses in his book "A New Kind of Science". The script uses the Python Imaging Library to render the output, but you could replace this with text or any other method of visualization.
#!/usr/bin/env python
"""
CellularAutomata.py: Wolfram-style cellular automata in Python

Options:
-h #    Use a screen of height # for the simulation
-w #    Use a screen of width # for the simulation
-r      Use a random initial row (rather than the standard single 1
        in the middle)
-R #    Use rule # for the simulation
"""

import getopt, sys
from random import randint

def ca_data(height, width, dorandom, rulenum):
    # Create the first row, either randomly, or with a single 1
    if dorandom:
        first_row = [randint(0, 1) for i in range(width)]
    else:
        first_row = [0] * width
        first_row[width // 2] = 1
    results = [first_row]
    # Convert the rule number to a list of outcomes.
    rule = [(rulenum // pow(2, i)) % 2 for i in range(8)]
    for i in range(height - 1):
        data = results[-1]
        # Determine the new row based on the old data. We first make an
        # integer out of the value of the old cell and its two neighbors
        # and then use that value to get the outcome from the table we
        # computed earlier
        new = [rule[4 * data[(j - 1) % width] + 2 * data[j] + data[(j + 1) % width]]
               for j in range(width)]
        results.append(new)
    return results

def pil_render(data, height, width, fname="bs.png"):
    from PIL import Image, ImageDraw
    img = Image.new("RGB", (width, height), (255, 255, 255))
    draw = ImageDraw.Draw(img)
    for y in range(height):
        for x in range(width):
            if data[y][x]:
                draw.point((x, y), (0, 0, 0))
    img.save(fname, "PNG")
    return

def main():
    opts, args = getopt.getopt(sys.argv[1:], 'h:w:rR:')
    height = 500
    width = 500
    dorandom = 0
    rule = 22
    for key, val in opts:
        if key == '-h': height = int(val)
        if key == '-w': width = int(val)
        if key == '-r': dorandom = 1
        if key == '-R': rule = int(val)
    data = ca_data(height, width, dorandom, rule)
    pil_render(data, height, width)
    return

if __name__ == '__main__':
    main()
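To make the rule encoding concrete, here is how a Wolfram rule number unpacks into the eight-entry outcome table used in ca_data above (rule 30 is shown purely as an example):

```python
rulenum = 30
# rule[n] is the new cell value when left*4 + center*2 + right == n,
# matching the lookup index built in ca_data.
rule = [(rulenum // pow(2, i)) % 2 for i in range(8)]
print(rule)  # → [0, 1, 1, 1, 1, 0, 0, 0]
```

In other words, the rule number is just the eight outcomes read off as the bits of a byte, which is why any integer from 0 to 255 names a distinct automaton.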
Stephen Wolfram's book "A New Kind of Science" is a fun read, and the pictures that his 1-d cellular automata generate are quite compelling. This is a simple little script that uses Python to generate these images. I use the Python Imaging Library to generate images from the data, but you could substitute any number of other methods. Much of the early work on cellular automata (Conway's game of life) was done using text output, and you could easily print "+" and " " (space) for the 1/0's here.
This script uses lists of integers to hold the rules; there's probably a way to speed the code up by using bit arithmetic, and I would welcome someone enlightening me about how this would be done.
Bitwise Versions. First, thanks for the recipe, Rick. While I have not read A New Kind of Science, I had the chance to see Wolfram lecture in support of the book and found myself amused and intrigued. I am eager to sit down and see what I can do with your code. That being said, here are bitwise arithmetic versions of the ca_data and pil_render functions.
def ca_data_bwa(height, width, dorandom, rulenum):
    ...

def pil_render_bwa(data, height, width, fname="bs_bwa.png"):
    import Image, ImageDraw
    img = Image.new("RGB", (width, height), (255, 255, 255))
    draw = ImageDraw.Draw(img)
    ...
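The bodies of the bitwise functions were cut off in this copy. As a sketch of what a bit-packed version might look like (the function names follow the comment above; the internals are my reconstruction, not the original code), each row can be stored as a single arbitrary-precision Python integer, one bit per cell:

```python
from random import getrandbits

def ca_data_bwa(height, width, dorandom, rulenum):
    # Rows are Python ints: bit j of a row is cell j.
    if dorandom:
        row = getrandbits(width)
    else:
        row = 1 << (width // 2)
    # Same 8-entry outcome table as the list version, built with shifts.
    rule = [(rulenum >> i) & 1 for i in range(8)]
    results = [row]
    for _ in range(height - 1):
        new = 0
        for j in range(width):
            left = (row >> ((j - 1) % width)) & 1
            center = (row >> j) & 1
            right = (row >> ((j + 1) % width)) & 1
            if rule[4 * left + 2 * center + right]:
                new |= 1 << j
        row = new
        results.append(row)
    return results

def pil_render_bwa(data, height, width, fname="bs_bwa.png"):
    from PIL import Image, ImageDraw
    img = Image.new("RGB", (width, height), (255, 255, 255))
    draw = ImageDraw.Draw(img)
    for y in range(height):
        row = data[y]
        for x in range(width):
            if (row >> x) & 1:
                draw.point((x, y), (0, 0, 0))
    img.save(fname, "PNG")
```

The per-cell extraction loop is still Python-level, which is consistent with the timing observation below: packing the bits saves memory but does not, by itself, buy speed.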
Unfortunately, bitwise ops are rather slow under Python, so the bitwise versions take roughly 6 times as long to run on my machine. So, no speed optimization for me, though your mileage may vary. I tried to optimize with various different approaches to bit shifting, but found that the << (left shift) operator was almost universally faster than pow and a number of other approaches.
On another dimension, these routines may provide an optimization. While I have not yet investigated the matter thoroughly, I suspect that the bitwise versions probably perform better than the originals with regard to memory consumption. While I am ignorant of the actual memory representation of Python long integers, it seems that, unless they are grossly inefficient, a list of arbitrarily long Python long integers must take up less space than an equivalent representation using a list of lists of 32-bit integers. Of course, the phrase "it seems reasonable that" often has very little to do with actual performance, so it would be nice if someone would check this out.
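As a quick, rough way to check the memory intuition, `sys.getsizeof` reports at least the container overhead. Note that for the list it counts only the pointer array, not the per-element int objects, so the true gap is even wider than this comparison shows:

```python
import sys

width = 500
list_row = [1] * width        # one cell per list element
int_row = (1 << width) - 1    # the same row packed into one long integer

# The packed row needs roughly width/8 bytes plus a small header;
# the list needs a machine pointer per cell before counting the ints.
print(sys.getsizeof(list_row), "bytes for the list row")
print(sys.getsizeof(int_row), "bytes for the packed row")
```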
Regardless of their efficiency, these alternate versions should serve as basic examples of bit arithmetic in Python. With the exception of the randomization function (where using long integers opened up obvious optimizations too good to pass up) I have attempted to keep the flow of the functions as similar as possible. The comments are a bit sparse, so, should anyone be really puzzled, I can clarify further.
profiled and speedy. I got fascinated by Wolfram's 1d automata experimentation as well. I went back to the math he presented in one of his papers, and came up with this implementation, which I profiled and tweaked until I got it as fast as I could make it without dropping into C:
step() is where all the real action occurs. That's where most of the tweaking happened. And while bitwise manipulation is slow, basic math seems to be fast.
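The code for this comment was not preserved in this copy. A plausible sketch of a table-driven step() in the spirit described (plain index arithmetic plus a list comprehension, no bitwise ops; the names here are my own) might be:

```python
def make_step(rulenum, width):
    # Precompute the 8-entry rule table once; step() then does only
    # index arithmetic and a single list comprehension per generation.
    table = [(rulenum >> i) & 1 for i in range(8)]
    def step(row):
        # row[j - 1] wraps around automatically when j == 0 (row[-1]),
        # so only the right neighbor needs an explicit modulus.
        return [table[4 * row[j - 1] + 2 * row[j] + row[(j + 1) % width]]
                for j in range(width)]
    return step

step = make_step(110, 11)
row = [0] * 11
row[5] = 1
for _ in range(4):
    row = step(row)
```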
Nice. Thanks, Dan. Go list comprehensions.
Er, Dave, sorry.
I came up with this shorter and faster implementation.
Time to compute Rule 110, 2500x2500:
1.9 s vs 8 s for David's solution (including image write)
0.9 s vs 6.8 s for David's solution (without image write)