A trivial Neural Network in DALIS

Recently I've been working on adding arrays and functions to my DALIS programming language. To test them out I decided to write a very simple Artificial Neural Network. I've hard-coded the weights of a 5-neuron, 3-layer feed-forward network: a classic XOR example with 2 inputs and one output. The output should be 1 if the two inputs are different, otherwise it should be 0. This is the output from the program:

Trivial Neural Network Example
Input 0,0 Output 1 is: 0
Input 1,0 Output 1 is: 1
Input 0,1 Output 1 is: 1
Input 1,1 Output 1 is: 0
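
To see how the hard-coded weights manage this, take the input 1,0 and the weights from the listing below. The first hidden neuron receives 1*2 + 0*(-1) = 2, which reaches the threshold of 2, so it fires and outputs 1; the second receives 1*(-1) + 0*2 = -1 and stays at 0. The output neuron then receives 1*2 + 0*2 = 2, fires, and gives the 1 shown above. For the input 1,1 both hidden sums are only 2-1 = 1, neither hidden neuron fires, and the final output stays at 0.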

...and this is the DALIS source code that I used:

:<b>Trivial Neural Network Example</b><br/>
ann(a=0, b=0)
ann(a=1, b=0)
ann(a=0, b=1)
ann(a=1, b=1)

FUNCTION ann()
  sizeL1=2, sizeL2=2, sizeL3=1
  L1[sizeL1]=0, L2[sizeL2]=0, L3[sizeL3]=0
  W2[sizeL1*sizeL2]=0, W3[sizeL2*sizeL3]=0
  W2[1]=2, W2[2]=-1, W2[3]=-1, W2[4]=2
  W3[1]=2, W3[2]=2

  L1[1]=a, L1[2]=b
  WRITE "Input "+TEXT(a,0)+","+TEXT(b,0)

  LOOP j=1, size=sizeL2
    LOOP i=1
      w=weightindex(f=j, s=sizeL1, b=i)
      L2[j] = L2[j] + L1[i] * W2[w]
    REPEAT i=i+1 IF i<=sizeL1
    L2[j]=output(x=L2[j], threshold=2)
  REPEAT j=j+1 IF j<=sizeL2

  LOOP j=1, size=sizeL3
    LOOP i=1
      w=weightindex(f=j, s=sizeL2, b=i)
      L3[j] = L3[j] + L2[i] * W3[w]
    REPEAT i=i+1 IF i<=sizeL2
    L3[j]=output(x=L3[j], threshold=2)
    WRITE " Output "+TEXT(j,0)+" is: "+TEXT(L3[j],0)+EOL
  REPEAT j=j+1 IF j<=sizeL3


FUNCTION weightindex()
  VALUE = (f*s)+b-s

FUNCTION output()
  retval=0
  IF x>=threshold
    retval=1
  VALUE = retval
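
For comparison, here's roughly the same network sketched in Python rather than DALIS. It's only an illustration, not part of DALIS, but the names, flat weight arrays, threshold-of-2 step activation and weight indexing all mirror the listing above. The weightindex formula (f*s)+b-s is just (f-1)*s+b: it maps destination neuron f and source neuron b onto the flat weight array, where s is the size of the previous layer. For example weightindex(f=2, s=2, b=1) gives 3, and W2[3]=-1 is the weight from the first input to the second hidden neuron.

def weightindex(f, s, b):
    # Same formula as the DALIS weightindex function, 1-based indices
    return (f * s) + b - s

def output(x, threshold=2):
    # Step activation: fire (1) once the weighted sum reaches the threshold
    return 1 if x >= threshold else 0

def ann(a, b):
    sizeL1, sizeL2, sizeL3 = 2, 2, 1
    L1 = [a, b]
    W2 = [2, -1, -1, 2]   # flattened L1 -> L2 weights
    W3 = [2, 2]           # flattened L2 -> L3 weights

    # Hidden layer: weighted sum of the inputs, then threshold
    L2 = []
    for j in range(1, sizeL2 + 1):
        total = sum(L1[i - 1] * W2[weightindex(j, sizeL1, i) - 1]
                    for i in range(1, sizeL1 + 1))
        L2.append(output(total))

    # Output layer: weighted sum of the hidden neurons, then threshold
    L3 = []
    for j in range(1, sizeL3 + 1):
        total = sum(L2[i - 1] * W3[weightindex(j, sizeL2, i) - 1]
                    for i in range(1, sizeL2 + 1))
        L3.append(output(total))
    return L3[0]

for a, b in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print("Input %d,%d Output 1 is: %d" % (a, b, ann(a, b)))

Running it prints the same four truth-table lines as the DALIS output above.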

So I'm now starting to think about genetic programming in DALIS: there's no real reason why DALIS programs couldn't write other DALIS programs and run them by posting them to the server.