
fun with encog

I had held out hope that my computer might be able to see
through the noise of the market and accurately predict the
next bar. That, it turns out, appears to be beyond the meager
processing power my computer currently has.

Here is the source code to a stock predictor program I wrote,
which uses neural networks with temporal (time-based) data
to predict the next bar in the time series.
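
The temporal windowing that Encog handles internally can be sketched in plain C#: slide a fixed-size window along the series and pair each window with the bar that follows it. This is only an illustration of the idea — `MakePairs` and its names are mine, not Encog's:

```csharp
using System;
using System.Collections.Generic;

class WindowSketch
{
    // Pair each window of `inputSize` values with the value that follows it,
    // which is roughly what a temporal data set's Generate() produces for a
    // single raw series.
    public static List<Tuple<double[], double>> MakePairs(double[] series, int inputSize)
    {
        var pairs = new List<Tuple<double[], double>>();
        for (int i = 0; i + inputSize < series.Length; i++)
        {
            double[] window = new double[inputSize];
            Array.Copy(series, i, window, 0, inputSize);
            pairs.Add(Tuple.Create(window, series[i + inputSize]));
        }
        return pairs;
    }

    static void Main()
    {
        // a few closes from the raw data below
        double[] closes = { 973.75, 970, 968.25, 966.25, 968.25, 969.75, 969.75 };
        foreach (var p in MakePairs(closes, 5))
            Console.WriteLine(string.Join(",", p.Item1) + " -> " + p.Item2);
    }
}
```

Each pair is one training case: five bars in, the sixth bar as the ideal output.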

the raw data...

1,973,971.75,971.75,973.75,23537,976.88,970.62,973.75,8.33,22.22,45.45,20.83,20.83,23537,22732
2,971.75,970,970,972,6350,977.1,969.79,973.45,7.41,29.48,52.64,0,13.51,6350,13589
3,970,968.25,968.25,970.25,9316,977.38,968.55,972.96,6.58,35.92,58.1,0,6.02,9316,14281
4,968.25,966.25,966.25,968.25,4452,977.67,966.94,972.3,5.85,43.04,62.59,0,0,4452,5296
5,966.25,968.25,966.25,968.25,10861,977.5,966.25,971.88,5.2,38.26,65.29,19.51,6.96,10861,5123
6,968.25,967.75,967.75,969.75,19762,976.7,965.83,971.27,12.96,34.01,61.87,14.63,11.38,19762,14855
7,967.75,969.75,967.75,969.75,16617,976.1,965.76,970.93,11.52,30.23,59.44,35.9,23.14,16617,14083
8,969.75,971.5,969.5,971.5,8973,975.65,965.85,970.75,19.96,26.87,53.85,56.76,35.04,8973,3416
9,971.5,970,970,972,22079,974.5,966.22,970.36,20.52,23.89,48.71,40.54,44.25,22079,22481
10,970,968.25,968.25,970.25,4454,973.86,966.11,969.98,18.24,30.95,46.17,26.67,42.31,4454,9171
11,968.25,967.25,967.25,969.25,13346,973.64,965.68,969.66,16.21,33.07,44.84,13.33,27.84,13346,12591
12,967.25,965.5,965.5,967.5,3536,973.75,964.83,969.29,14.41,39.12,44.99,0,12.9,3536,8745
13,965.5,964.5,964.5,966.5,12983,973.42,964.01,968.71,12.81,40.33,45.74,0,4,12983,18985
14,964.5,966,964,966,17541,972.46,963.97,968.21,11.39,38.62,46.71,20.51,7.34,17541,15404
15,966,967.25,965.25,967.25,13037,971.68,964.11,967.89,17.07,34.33,45.25,40.63,19.44,13037,8246
16,967.25,965.25,965.25,967.25,4520,971.37,963.74,967.55,15.17,30.52,43.96,15.63,25.24,4520,9550
17,965.25,966.25,964.25,966.25,11435,971.26,963.56,967.41,13.48,32.68,43.69,28.13,28.13,11435,10845
18,966.25,964.75,964.75,966.75,4687,971.36,963.25,967.3,14.76,29.05,42.46,9.38,17.71,4687,8352
19,964.75,966,964,966,11627,971.21,963.07,967.14,13.12,29.99,42.09,25,20.83,11627,10266
20,966,964.5,964.5,966.5,7755,971.18,962.64,966.91,14.44,26.66,40.72,6.25,13.54,7755,10491
21,964.5,966.25,964.25,966.25,9107,970.64,962.68,966.66,12.84,25.08,39.78,28.13,19.79,9107,7545
22,966.25,968,966,968,12293,969.47,963.35,966.41,21.13,22.3,35.66,50,28.13,12293,7340
23,968,966.25,966.25,968.25,13619,968.47,963.82,966.14,20.17,19.82,31.79,36,38.2,13619,14963
24,966.25,965,965,967,2760,967.99,963.84,965.91,17.93,24.56,29.99,19.05,37.18,2760,7288
25,965,966.5,964.5,966.5,14754,967.83,963.89,965.86,15.94,24.61,29.04,58.82,36.51,14754,16066
26,966.5,964.5,964.5,966.5,3754,967.87,963.7,965.79,14.17,21.88,28.19,11.76,29.09,3754,6309
27,964.5,962.75,962.75,964.75,17381,968.2,963.12,965.66,12.6,29.17,29.46,0,21.43,17381,15998
28,962.75,964.5,962.5,964.5,20715,968.15,962.95,965.55,11.2,27.32,30.84,34.78,16.13,20715,18173
29,964.5,965,963,965,17395,967.83,962.96,965.39,12.73,24.28,30.88,43.48,26.47,17395,15945
30,965,966.5,964.5,966.5,17180,967.98,962.99,965.48,19.65,21.58,27.97,69.57,49.28,17180,7791
31,966.5,968.25,966.25,968.25,27704,968.48,962.77,965.63,27.19,19.18,26.78,100,71.01,27704,20235
32,968.25,969.25,967.25,969.25,36771,969.31,962.59,965.95,29.72,17.05,26.82,100,90.41,36771,31466
33,969.25,971,969,971,14368,970.56,962.05,966.3,36.14,15.16,28.38,100,100,14368,7738
34,971,969,969,971,33126,970.96,962.29,966.63,32.13,13.47,29.77,76.47,91.58,33126,23333
35,969,970.75,968.75,970.75,14035,971.77,962.13,966.95,28.56,13.37,30.49,97.06,91.18,14035,9645
36,970.75,969,969,971,11155,971.93,962.11,967.02,26.77,11.88,31.38,76.47,83.33,11155,14258
37,969,970.75,968.75,970.75,22082,972.58,962.1,967.34,23.8,11.95,31.58,97.06,90.2,22082,13612
38,970.75,971.75,969.75,971.75,16710,973.35,962.29,967.82,26.71,10.62,32.86,100,91.43,16710,12423
39,971.75,973.25,971.25,973.25,11437,974.43,962.18,968.3,32.07,9.44,35.26,100,99.12,11437,3860
40,973.25,974.5,972.5,974.5,23000,975.52,962.51,969.02,35.46,8.39,38.2,100,100,23000,17255
41,974.5,974.75,972.75,974.75,40920,976,963.75,969.88,32.9,7.46,40.96,100,100,40920,34714
42,974.75,973,973,975,13201,976.02,964.95,970.48,30.64,6.63,43.57,83.33,94.48,13201,17222
43,973,974.25,972.25,974.25,26557,976.08,966.21,971.14,27.23,10.06,43.84,92.86,92.09,26557,20605
44,974.25,972.75,972.75,974.75,8998,975.85,967.33,971.59,26.99,8.94,44.55,74.29,84,8998,7437
45,972.75,973.75,971.75,973.75,9698,975.94,968.02,971.98,23.99,13.51,42.71,83.87,84.26,9698,12787
46,973.75,974.5,972.5,974.5,18086,976.2,968.51,972.36,25.49,12,41.96,92,82.42,18086,12323
47,974.5,973.25,973.25,975.25,10064,976.31,968.73,972.52,26.82,10.67,42.08,69.23,81.71,10064,11397
48,973.25,974.5,972.5,974.5,18020,976.28,969.54,972.91,23.84,13.65,40.43,88.46,83.12,18020,17466
49,974.5,976,974,976,20110,976.77,969.8,973.29,29.53,12.13,40.57,100,86.42,20110,14506
50,976,977.25,975.25,977.25,40864,977.04,970.71,973.88,33.19,10.79,41.73,100,96.63,40864,24850
51,977.25,977.75,975.75,977.75,36285,977.62,971.13,974.38,32.28,9.59,43.11,100,100,36285,30288
52,977.75,978.5,976.5,978.5,21972,978.39,971.32,974.86,32.86,8.52,44.86,100,100,21972,22590
53,978.5,977,977,979,1887,978.7,971.55,975.13,31.99,7.58,46.73,72.41,91.11,1887,937
54,977,975,975,977,0,978.72,971.6,975.16,28.43,17.85,44.08,44.83,72.41,0,0
55,975,976,974,976,14574,978.83,971.67,975.25,25.27,21.42,40.1,58.62,58.62,14574,11257
56,976,974.5,974.5,976.5,11593,978.74,971.97,975.36,25.24,19.04,37.2,37.93,47.13,11593,16110
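Each comma-separated row is one bar. The field names below are my reading of the column comments in loadCSVData() further down (close, Bollinger bands, DM/ADX, stochastics, up/down ticks) — treat the mapping as an assumption, not documentation of the data file:

```csharp
using System;

class RowSketch
{
    static void Main()
    {
        // first row of the raw data above; column meanings inferred from the
        // comments in loadCSVData() - assumptions, not from the data source
        string row = "1,973,971.75,971.75,973.75,23537,976.88,970.62,973.75,"
                   + "8.33,22.22,45.45,20.83,20.83,23537,22732";
        string[] f = row.Split(',');
        Console.WriteLine("bar #" + f[0]
            + " close=" + f[2]
            + " bollingerTop=" + f[6]
            + " bollingerBottom=" + f[7]
            + " adx=" + f[11]
            + " stochFast=" + f[12]
            + " upTicks=" + f[14]);
    }
}
```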


/*
* Created by SharpDevelop.
* User: Terry
* Date: 9/5/2009
* Time: 3:01 PM
*
* To change this template use Tools | Options | Coding | Edit Standard Headers.
*/

using System;
using predictor.sine;

namespace predictor
{
 class Program
 {
  public static void Main(string[] args)
  {
   Console.WriteLine("Hello World!");

   Data data = new Data(500);

   Predictor predictor = new Predictor();
   predictor.setData(data.actual);
   predictor.loadCSVData();
   predictor.dumpData();
   predictor.createNetwork();
   predictor.LoadNeuralNetwork();
   predictor.generateTrainingSets();
   predictor.generateValidationTrainingSets();
   predictor.trainNetworkGeneticAlgorithm();

   //  predictor.trainNetworkBackprop();
   //  predictor.trainNetworkAnneal();
   //  predictor.TrainNetworkHybrid();

   predictor.display();
   predictor.SaveNeuralNetwork();

    Console.Write("Press any key to continue . . . ");
   Console.ReadKey(true);
  }
 }
}

the main neural net learning piece..
/*
* Created by SharpDevelop.
* User: Terry
* Date: 9/5/2009
* Time: 3:21 PM
*
* To change this template use Tools | Options | Coding | Edit Standard Headers.
*/

using System;
using System.Collections.Generic;

using Encog.Neural.Networks;
using predictor.sine;
using Encog.Neural.Networks.Layers;
using Encog.Neural.Activation;
using Encog.Neural.Data;
using Encog.Neural.NeuralData.Temporal;
using Encog.Neural.Data.Basic;
using Encog.Neural.Networks.Training.Propagation.Back;
using Encog.Neural.Networks.Training.Genetic;
using Encog.Neural.Networks.Training.Anneal;
using Encog.Neural.NeuralData.CSV;
using Encog.Util.Randomize;
using System.Data;
using predictor.util;
using System.Text;
using System.IO;
using System.Collections;

namespace predictor.sine
{
 /// <summary>
 /// Description of Predictor.
 /// </summary>

 public class Predictor
 {
  private const int ACTUAL_SIZE = 500;
  private const int TRAINING_SIZE = 250;
  private const int INPUT_SIZE = 5;
  private const int OUTPUT_SIZE = 1;
  private const int NEURONS_HIDDEN_1 = 7;
  private const double MAX_ERROR = 0.05;
  private double[] data;
  private BasicNetwork network;
  private TemporalNeuralDataSet temporalDataSet;
  private TemporalNeuralDataSet validationDataSet;
  private TemporalNeuralDataSet trainDataSet;

  public void setData(double [] data)
  {
   this.data = data;
  }



  public void createNetwork()
  {
   this.network = new BasicNetwork();
   this.network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 55));
   this.network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 28));
   this.network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 1));
   this.network.Structure.FinalizeStructure();

   // NetworkCODEC.arrayToNetwork(Predictor.RANDOM_NET, this.network);
  }

  public void generateTrainingSets()
  {

   this.temporalDataSet = new TemporalNeuralDataSet(Predictor.INPUT_SIZE, Predictor.OUTPUT_SIZE);
   this.temporalDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false));
   this.temporalDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, false, true));

   for (int i = 0; i < Predictor.TRAINING_SIZE; i++)
   {
    TemporalPoint tp = this.temporalDataSet.CreatePoint(i);
    tp.Data[0] = this.data[i];
   }
   this.temporalDataSet.Generate();
  }

  public void generateValidationTrainingSets()
  {
   this.validationDataSet = new TemporalNeuralDataSet(Predictor.INPUT_SIZE, Predictor.OUTPUT_SIZE);
   this.validationDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false));
   this.validationDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, false, true));

   for (int i = Predictor.TRAINING_SIZE; i < Predictor.ACTUAL_SIZE; i++)
   {
    TemporalPoint tp = this.validationDataSet.CreatePoint(i);
    tp.Data[0] = this.data[i];
   }
   this.validationDataSet.Generate();
  }

  public void display()
  {
   IList<BasicNeuralDataPair> dataPair = this.validationDataSet.Data;

   //   IList<BasicNeuralDataPair> dataPair = this.temporalDataSet.Data;

   foreach (BasicNeuralDataPair dp in dataPair)
   {
    BasicNeuralData inputData = (BasicNeuralData)dp.Input;

    for (int x = 0; x < inputData.Count; x++)
    {
     Console.WriteLine("input->" + inputData.Data[x]);
    }

    BasicNeuralData idealData = (BasicNeuralData)dp.Ideal;

    for (int x = 0; x < idealData.Count; x++)
    {
     Console.WriteLine("ideal->" + idealData.Data[x]);
    }

    BasicNeuralData predictedNd = (BasicNeuralData)this.network.Compute(inputData);

    for (int y = 0; y < predictedNd.Count; y++)
    {
     Console.WriteLine("predicted->>" + predictedNd.Data[y]);
    }
    Console.WriteLine(" ");
   }
  }

  public void dumpData()
  {
   IList<BasicNeuralDataPair> dataPair = this.trainDataSet.Data;
   foreach (BasicNeuralDataPair dp in dataPair)
   {
    BasicNeuralData inputData = (BasicNeuralData)dp.Input;

    for (int x = 0; x < inputData.Count; x++)
    {
     Console.WriteLine("input->" + inputData.Data[x]);
    }

    BasicNeuralData idealData = (BasicNeuralData)dp.Ideal;

    for (int x = 0; x < idealData.Count; x++)
    {
     Console.WriteLine("ideal->" + idealData.Data[x]);
    }

    Console.WriteLine(" ");
   }
  }

 public void trainNetworkBackprop()
 {
  Backpropagation train = new Backpropagation(this.network, this.trainDataSet, 0.001, 0.1);
  int epoch = 1;
  do
  {
   train.Iteration();
   Console.WriteLine("Iteration #" + epoch + " Error:" + train.Error);
   epoch++;
   this.SaveNeuralNetwork();
  }
 
  while ((epoch < 5000) && (train.Error > 0.03)); // epoch cap reconstructed; original value lost in formatting
 }

 public void trainNetworkAnneal()
 {
  NeuralSimulatedAnnealing train = new NeuralSimulatedAnnealing(this.network, this.trainDataSet, 10, 2, 100);
  int epoch = 1;

  do
  {
   train.Iteration();
   Console.WriteLine("Iteration #" + epoch + " Error:" + train.Error);

   epoch++;

   //
   // important..save current state of learning..
   //
   this.SaveNeuralNetwork();
  } while ((train.Error > 0.04));

 }



 public void trainNetworkGeneticAlgorithm()
 {
  RangeRandomizer randomizer = new RangeRandomizer(950, 1100);
  NeuralGeneticAlgorithm train = new TrainingSetNeuralGeneticAlgorithm(this.network, randomizer, this.trainDataSet, 100, 0.6, 0.4);

  int epoch = 1;
  do
  {
   train.Iteration();
   Console.WriteLine("Iteration #" + epoch + " Error:" + train.Error);
   epoch++;
   //
   // important..save current state of learning..
   //
   this.SaveNeuralNetwork();
  }
  while ((train.Error > 0.04));
 }

 public void TrainNetworkHybrid()
 {
  Backpropagation train = new Backpropagation(this.network, this.trainDataSet, 0.00001, 0.1);
  double lastError = Double.MaxValue;
  int epoch = 1;
  int lastAnneal = 0;
  
  do
  {
   train.Iteration();
   double error = train.Error;

   Console.WriteLine("Iteration(Backprop) #" + epoch + " Error:" + error);
   //
   // important..save current state of learning..
   //
   this.SaveNeuralNetwork();

   if (error > 0.05)
   {
    //
    // if backprop has stalled, kick the weights loose with
    // simulated annealing..
    //
    if ((lastAnneal > 100) && (error > lastError || Math.Abs(error - lastError) < 0.0001))
    {
     trainNetworkAnneal();
     lastAnneal = 0;
    }
   }

   lastError = train.Error;
   lastAnneal++;
   epoch++;
  } while (train.Error > MAX_ERROR);
 }





 public void loadCSVData()
 {
  double [,] data = new CsvUtil().parseFile("C:\\Users\\Terry\\Documents\\dataOrig.csv");
  //   for(int x = 0; x < data.GetLength(0); x++)
  //   {
  //    for(int y = 0; y < data.GetLength(1); y++)
  //    {
  //     Console.Write("<" + y + ">" + data[x,y]);
  //    }
  //    Console.WriteLine("");
  //   }

  this.trainDataSet = new TemporalNeuralDataSet(5, 1);
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 1
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 2
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 3
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 4
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 5
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 6
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 7
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 8
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 9
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 10
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, true, false)); // input 11
  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, false, true)); // output 1
  //  this.trainDataSet.AddDescription(new TemporalDataDescription(TemporalDataDescription.Type.RAW, false, true)); // output 2

  for(int x = 0; x < (data.GetLength(0) / 2) - 2; x++)
  {
   TemporalPoint tp = this.trainDataSet.CreatePoint(x);
   tp.Data.SetValue(data[x,2], 0); // close
   tp.Data.SetValue(data[x,6], 1); // bolinger top
   tp.Data.SetValue(data[x,7], 2); // bolinger bottom
   tp.Data.SetValue(data[x,8], 3); // bolinger middle
   tp.Data.SetValue(data[x,9], 4); // dm plus
   tp.Data.SetValue(data[x,10], 5); // dm minus
   tp.Data.SetValue(data[x,11], 6); // dm adx
   tp.Data.SetValue(data[x,12], 7); // stoch fast
   tp.Data.SetValue(data[x,13], 8); // stoch slow
   tp.Data.SetValue(data[x,14], 9); // upticks
   tp.Data.SetValue(data[x,15], 10); // downticks
   tp.Data.SetValue(data[x,2], 11); // predict close bar 1 
   //    tp.Data.SetValue(data[x + 1,2], 12); 
   // predict close bar 2    
  }
  this.trainDataSet.Generate();
 }

 public void LoadNeuralNetwork()
 {
  this.network = (BasicNetwork)Serialize.Load("C:\\Users\\Terry\\Desktop\\download\\stock.net");
 }

 public void SaveNeuralNetwork()
 {
  Serialize.Save("C:\\Users\\Terry\\Desktop\\download\\stock.net", this.network);
 }
 }
}
the csv data importer
/*
* Created by SharpDevelop.
* User: Terry
* Date: 9/7/2009
* Time: 6:34 PM
*
* To change this template use Tools | Options | Coding | Edit Standard Headers.
*/

using System;
using System.Collections;
using System.Data;
using System.Text;
using System.IO;
using System.Text.RegularExpressions;

namespace predictor.util
{

 public class CsvUtil
 {
  public double[,] parseFile(String fileName)
  {
   StreamReader fileReader = new StreamReader(fileName);
   int numLines = this.countNumberOfLinesInFile(fileName);
   int numColumns = this.countNumberOfcolumnsInFile(fileName);
   double [,] lineItemsArray = new double[numLines,numColumns];

   String line = null;
   int x = 0;

   while ((line = fileReader.ReadLine()) != null)
   {
    string[] DataColumn = line.Split(new char[] {','});
     for (int y = 0; y < DataColumn.Length; y++)
     {
      lineItemsArray[x,y] = Double.Parse(DataColumn[y]);
     }
     x++;
    }

    fileReader.Close();
    return lineItemsArray;
   }

   private int countNumberOfLinesInFile(String fileName)
   {
    StreamReader fileReader = new StreamReader(fileName);
    String line = null;
    int x = 0;

    while ((line = fileReader.ReadLine()) != null)
    {
     x++;
    }
    fileReader.Close();
    return x;
   }

   private int countNumberOfcolumnsInFile(String fileName)
   {
    StreamReader fileReader = new StreamReader(fileName);
    String line = fileReader.ReadLine();
    fileReader.Close();
    return line.Split(new char[] {','}).Length;
   }
  }
}

a piece of lava flow code..
/*
* Created by SharpDevelop.
* User: Terry
* Date: 9/5/2009
* Time: 3:03 PM
*
* To change this template use Tools | Options | Coding | Edit Standard Headers.
*/
using System;
using Encog.Neural.NeuralData.CSV;
using System.Data;
using System.IO;
using System.Text.RegularExpressions;

namespace predictor.sine
{
 class Data
 {
  //
  // the total number of elements in the data..
  //
  public double [] actual;
  //
  // retrieve the calculation of the curve at any particular point..
  //
  public static double calculateSine(double deg)
  {
   double rad = deg * (Math.PI / 180);
   double result = Math.Sin(rad);
   double output = ((int) (result * 100000.0)) / 100000.0;
   return output;
  }

  //
  // create all the actual data which will be used in all the training
  // and validation
  //
  public Data(int size)
  {
   //
   // make an array for which to hold every element in the training and
   // validation
   // steps
   //
   this.actual = new double[size];

   //
   // the starting position of the sine wave line..
   //

   int angle = 0;

   //
   // go through the number of elements in all the actual data
   // and place it into an array..
   //
   for (int i = 0; i < size; i++)
   {
    this.actual[i] = Data.calculateSine(angle++); // angle step reconstructed; original increment lost in formatting
   }
  }
 }
}
the piece of code which allows the trained network to be saved and loaded..
/*
* Created by SharpDevelop.
* User: Terry
* Date: 9/6/2009
* Time: 9:53 AM
*
* To change this template use Tools | Options | Coding | Edit Standard Headers.
*/

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

namespace predictor.sine
{
 /// <summary>
 /// Description of Serialize.
 /// </summary>
 public class Serialize
 {
  public static object Load(string filename)
  {
   Stream s = new FileStream(filename, FileMode.Open, FileAccess.Read, FileShare.None);
   BinaryFormatter b = new BinaryFormatter();
   object obj = b.Deserialize(s);
   s.Close();
   return obj;
  }

  public static void Save(string filename, object obj)
  {
   Stream s = new FileStream(filename, FileMode.Create, FileAccess.Write, FileShare.None);
   BinaryFormatter b = new BinaryFormatter();
   b.Serialize(s, obj);
   s.Close();
  }
 }
}

Comments

Roy S said…
Yeah but which goddam stocks did it pick!?!?
