Want to make your own trading bot? You should check out Markov Chains

<html>
<h1><img src="http://profittradingbot.com/images/headerImg.jpg" width="1920" height="801"/>Programming language</h1>
<p>&nbsp;&nbsp;&nbsp;&nbsp;It does not matter which technology or programming language you choose to write a trading bot; you should focus on the algorithms that will make your bot run better. A good bot should anticipate market behavior, bring us profit, and guarantee a low risk of capital loss. A basic approach would be setting a 'border value': buying when the market is red and selling when the profit exceeds an assumed percentage. Such an approach may work, but the probability of loss is high.</p>
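<p>The 'border value' approach could be sketched like this. This is a minimal illustration only; the function name, the price series, and the thresholds are all made-up assumptions, not a real exchange API.</p>

```python
# Minimal sketch of the naive 'border value' strategy: buy on a dip
# below a fixed threshold, sell once a fixed profit percentage is hit.
# All names and numbers are illustrative assumptions.

def threshold_bot(prices, buy_below, take_profit_pct):
    """Buy when price falls below buy_below; sell once profit exceeds take_profit_pct."""
    position = None   # entry price while holding, else None
    profit = 0.0
    for price in prices:
        if position is None and price < buy_below:
            position = price                              # buy on the "red" dip
        elif position is not None and price >= position * (1 + take_profit_pct):
            profit += price - position                    # sell at the profit target
            position = None
    return profit

# Made-up prices: buys at 95, sells when 100 >= 95 * 1.05
print(threshold_bot([100, 95, 97, 100, 102], buy_below=96, take_profit_pct=0.05))  # 5.0
```

<p>Notice the strategy has no notion of where the market is heading; it reacts only to fixed thresholds, which is exactly why the probability of loss stays high.</p>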
<h1>Markov chains</h1>
<p>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;To understand Markov chains, I will present several situations in which they apply.</p>
<p>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;A student takes part in a class once a week. If in one week the student came prepared, the probability that he will be prepared the next week is 0.7; on the other hand, if in a given week the student is unprepared, the probability of being prepared the next week is 0.2. Markov chains will help you answer questions such as: if a student is unprepared in a given week (red), how long will it take for him to become prepared (green)? Or: if in a given week the student is prepared for the class, how long will we have to wait until he is unprepared? (How long will it stay green?)</p>
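<p>The student example can be worked out directly. With only two states, the wait for a single transition is geometric, so its expected length is 1/p; the sketch below encodes the transition probabilities from the text and computes both answers.</p>

```python
# Two-state Markov chain from the student example.
# Rows: current state; values: probability of next week's state.
P = {
    "prepared":   {"prepared": 0.7, "unprepared": 0.3},
    "unprepared": {"prepared": 0.2, "unprepared": 0.8},
}

def expected_weeks_until(state, target):
    """Expected number of weeks to first reach `target` from `state`.
    Valid for this two-state chain: the wait is geometric, mean 1/p."""
    return 1.0 / P[state][target]

print(expected_weeks_until("unprepared", "prepared"))    # 5.0 weeks on average
print(expected_weeks_until("prepared", "unprepared"))    # ~3.33 weeks of staying "green"
```

<p>So an unprepared student needs on average 1/0.2 = 5 weeks to turn green, and a prepared one stays green for about 1/0.3 ≈ 3.3 weeks; swap "prepared" for "price goes up" and the same arithmetic applies to a market.</p>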
<h1>If it's so easy, then why is it so hard?</h1>
<p>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;The biggest problem is gathering enough data to determine the probability of a particular event. In the beginning, the bot should run without risking real money. The later the bot starts trading on a real exchange, the better its estimates, and the better the results will be.</p>
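<p>One simple way to gather those probabilities is to label each historical period "up" or "down" and count how often each state follows each other state. The sketch below assumes a made-up price list; a real bot would feed in exchange data, and flat periods would need their own state rather than being lumped in with "down" as here.</p>

```python
# Estimate transition probabilities P(next_state | state) by counting
# consecutive moves in a price history. Illustrative sketch only:
# a tick with no change is counted as "down" for simplicity.
from collections import Counter

def estimate_transitions(prices):
    """Return a dict mapping (state, next_state) -> estimated probability."""
    moves = ["up" if b > a else "down" for a, b in zip(prices, prices[1:])]
    counts = Counter(zip(moves, moves[1:]))               # transitions observed
    totals = Counter(m for m, _ in zip(moves, moves[1:])) # times each state was left
    return {(s, t): counts[(s, t)] / totals[s] for (s, t) in counts}

history = [1, 2, 3, 2, 3, 4, 3, 2, 3]   # made-up price history
print(estimate_transitions(history))
```

<p>On this toy history the bot would learn, for example, that a down move is followed by an up move about 2/3 of the time; with more data these estimates converge, which is why a long paper-trading phase pays off.</p>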
<h1>Additional resources</h1>
<p>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;If you still do not understand how Markov chains work, check http://setosa.io/ev/markov-chains/ for a visual explanation or http://www.statslab.cam.ac.uk/~rrw1/markov/ for a full course in this field.</p>
</html>