Post by ezmoney on Aug 10, 2013 19:18:57 GMT -5
I am trying to do some market analysis.
There are nearly 8000 stocks, each with 100 days of 6 values.
I want to compute a correlation coefficient between all the stocks.
I guess I could break it down and divide the stocks by symbol.
This would generate 8000 files for the 100 days.
Thus I would pull in the stock1 data and the stock2 data, do the correlation, then read the stock3 data and go again up to the limit of 8000.
Once there, stock1 would be replaced with stock2 and it would start from 1 to 8000 again, skipping anything that had already been done. Doing stock1 with stock2 and then stock2 with stock1 would generate the same result and is just a waste of time. Nor would I correlate anything with itself.
I was wondering if SQLite would help hold the data in one file, making retrieval easy and maybe updates too, with minimum programming. The thing about using an in-memory database is that the data is gone when it closes, but with a database file I could generate the data once and then use it for different analyses.
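For what it's worth, a regular SQLite database is an ordinary file on disk and survives between runs; only the special `:memory:` mode is lost on close. A minimal sketch in Python's standard `sqlite3` module (the table layout and column names here are my own guesses, not anything from the post):

```python
import sqlite3

# One table holds every stock's daily rows; (symbol, day) identifies a row.
# The six value columns are assumed to be OHLC, volume, and adjusted close.
con = sqlite3.connect("quotes.db")   # an ordinary file; persists between runs
con.execute("""CREATE TABLE IF NOT EXISTS quotes (
    symbol TEXT, day TEXT,
    open REAL, high REAL, low REAL, close REAL,
    volume REAL, adj_close REAL,
    PRIMARY KEY (symbol, day))""")

rows = [("ABC", "2013-08-09", 10.0, 10.5, 9.8, 10.2, 120000, 10.2)]
con.executemany("INSERT OR REPLACE INTO quotes VALUES (?,?,?,?,?,?,?,?)", rows)
con.commit()

# Pull one stock's closing prices in date order for the correlation step.
closes = [r[0] for r in con.execute(
    "SELECT close FROM quotes WHERE symbol=? ORDER BY day", ("ABC",))]
print(closes)
con.close()
```

So instead of 8000 files, one database file and one `SELECT` per symbol would do the retrieval.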
I could do it as a random-access file also.
Any suggestions?
I don't care if it runs all week.
I only have to do half and never correlate anything with itself.
The amount of data is very high, but the output is one line for each correlation.
I'll have to dump the output to a file and then read it.
Basically, as I see it, the plan is to load the data and run the algorithm to do each stock with each other stock.
Once a value is placed in the array, it would never do those two again.
Post by metro on Aug 10, 2013 20:10:05 GMT -5
Post by ezmoney on Aug 18, 2013 16:27:16 GMT -5
Thanks Laurie:
I was looking more at the low and high of the daily values; the wider the percent spread, the higher the return on investment.
I've managed to weed out a lot of stocks due to high price and low volume.
I wrote the code in Run Basic and it takes about 8 minutes to get through all of them, about 190 stocks, but as more days are added that should increase.
It works nicely, and then I can sort them into order by that high-low spread.
I have about 66 days in the system, so the run time should grow to over 30 minutes for 250 days of data.
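The screen described above, a daily high-low range as a percent with high-price and low-volume stocks weeded out, then sorted widest first, could look like this (the cutoff values and the field layout are my own assumptions, not from the post):

```python
# Toy rows: (symbol, low, high, volume), say averaged over the period.
stocks = [
    ("AAA", 9.50, 10.20, 500_000),
    ("BBB", 120.00, 121.00, 800_000),   # high price -> weeded out
    ("CCC", 4.00, 4.60, 2_000),         # low volume -> weeded out
    ("DDD", 6.00, 6.90, 300_000),
]

MAX_PRICE, MIN_VOLUME = 50.0, 10_000    # assumed cutoffs, not from the post

def spread_pct(low, high):
    """High-low range as a percent of the low."""
    return (high - low) / low * 100

kept = [(sym, spread_pct(lo, hi))
        for sym, lo, hi, vol in stocks
        if hi <= MAX_PRICE and vol >= MIN_VOLUME]
kept.sort(key=lambda t: t[1], reverse=True)   # widest spread first
for sym, pct in kept:
    print(f"{sym} {pct:.2f}%")
```

Sorting on the computed percent rather than the raw dollar range keeps cheap and expensive stocks comparable.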
Plug and crank, garbage in and garbage out.
Laurie, let me know if you have any other ideas; I'd like to buy at the low and sell at the high.
Maybe a buy and sell price predictor based on previous data might get close.
If anyone else has any ideas please post.
Anybody know about PID predictors/controllers? Or maybe regression might work a day out?
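On the regression idea: a least-squares line through the last few closes, extrapolated one step, is about the simplest "a day out" predictor. A pure-stdlib sketch (the window of five closes is arbitrary, and this is only a toy, not a tested trading model):

```python
def predict_next(closes):
    """Fit y = a + b*x by ordinary least squares over the given closes
    (x = 0, 1, ..., n-1) and extrapolate one step ahead to x = n."""
    n = len(closes)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(closes) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, closes)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a + b * n          # the fitted line's value for tomorrow

closes = [10.0, 10.2, 10.1, 10.4, 10.5]   # made-up closes
print(round(predict_next(closes), 3))
```

A PID controller would instead react to the running error between predicted and actual price, which is a heavier lift; the regression line is the easy first experiment.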
The more times the price gyrates up and down the more likely that I can capture some piece of the action.
Even a 1.01 percent gain for 250 trades (about 1 year) should compound out to near 12 times the original investment.
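That compounding arithmetic checks out: multiplying the stake by 1.0101 on each of 250 trades lands a bit over 12x.

```python
gain = 1.0101          # a 1.01 percent gain per trade
trades = 250           # roughly one trade per market day for a year
factor = gain ** trades
print(round(factor, 2))   # a bit over 12x the original stake
```

Of course that assumes every trade wins by exactly that margin; a single losing trade pulls the whole product down.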
Data shows a 2.5 and sometimes 3.5 or better on occasion, but these market run-ups are generally followed by a sharp pullback.
Then it slowly claws its way back until the next big pullback.
Thus if I keep the data up to date, the pattern should repeat: I should be able to buy near the low, sell near the high, and make a lot of cash profit long term.
Those dollars mount up.
Using Run Basic I've designed the program to do just that; now the tuning begins.
Mostly just finding where that ideal peak is on the sell side.
You want to make the highest percent yield the most number of times. The system should tell me that ideal sell number, and a buy number, so I can make the most on my investment.
Thanks, Run Basic, it works rather uniquely.