Backtesting with ASCII data painfully slow......

Questions about MultiCharts and user contributed studies.
bluefightingcat
Posts: 38
Joined: 31 Oct 2015
Has thanked: 1 time
Been thanked: 2 times

Backtesting with ASCII data painfully slow......

Postby bluefightingcat » 02 Dec 2015

I am hoping somebody more informed than me can explain why it takes so long to backtest using ASCII data??

So I've downloaded all my 1 minute and 1 day data to my SSD using QCollector. I've ASCII mapped all my US stocks (all 6200 of them).

Can somebody explain what Multicharts does whilst it is "loading data..." before the actual backtest. Even when trying to just backtest all stocks starting with the letter "A" instead of all 6200 stocks, it takes forever. The "loading data...." window is stuck on "0 quotes received". It takes over half an hour to load the data for a 2 month backtest using 15 minute and 1 day data using only stocks that start with the letter "A". It takes over 8 hours to get all 6200 stocks loaded.

I would have assumed that since the data resides on my SSD, it would be substantially quicker. Or does multicharts somehow "translate" the data into something multicharts can read?

What am I missing here? What can I do to fix this issue? Or am I just expecting too much?

TJ
Posts: 7176
Joined: 29 Aug 2006
Location: Global Citizen
Has thanked: 990 times
Been thanked: 2049 times

Re: Backtesting with ASCII data painfully slow......

Postby TJ » 02 Dec 2015

This is a hardware and Windows legacy issue, not a MultiCharts one.

ASCII records are fetched one at a time.
This is how the operating system works. MultiCharts has no say in it.
An SSD helps, but you are still bound by the "one record at a time" mode of operation.

You should import the data into MultiCharts first, so that the data resides inside MultiCharts, then the testing will be faster.


ps. 6,200 symbols? That's a lot no matter how you slice it; you are probably dealing with BILLIONS of data points. So importing the data first makes sense.

pps. you are not the first person trying to backtest with ASCII data. Do a search in the forum and you will see the same discussion.

bluefightingcat
Posts: 38
Joined: 31 Oct 2015
Has thanked: 1 time
Been thanked: 2 times

Re: Backtesting with ASCII data painfully slow......

Postby bluefightingcat » 02 Dec 2015

Unfortunately importing is not really an option because I would have to manually import 6200 symbols x 2 (1 minute data and 1 day data). That would take me forever. That's why I have mapped the ASCII data.

TJ
Posts: 7176
Joined: 29 Aug 2006
Location: Global Citizen
Has thanked: 990 times
Been thanked: 2049 times

Re: Backtesting with ASCII data painfully slow......

Postby TJ » 02 Dec 2015

You can import the 1 min data.
MultiCharts will construct the daily bars from the 1 min data.

TJ
Posts: 7176
Joined: 29 Aug 2006
Location: Global Citizen
Has thanked: 990 times
Been thanked: 2049 times

Re: Backtesting with ASCII data painfully slow......

Postby TJ » 02 Dec 2015

Please contact Tech Support, maybe they can suggest an easy way to import 6,200 ASCII data symbols.

fbertram
Posts: 166
Joined: 16 Oct 2014
Location: Seattle, USA
Has thanked: 36 times
Been thanked: 73 times

Re: Backtesting with ASCII data painfully slow......

Postby fbertram » 03 Dec 2015

Hi BlueFightingCat,

I am seeing the same issue here. Loading of data is very slow, even though I am only working with about 200 stocks on 5 minute charts. I have imported the data before, but from what I saw, this did *not* speed things up. Also, this becomes a major nuisance when receiving data updates.

I am using a 16-core machine from Google and what I see is that MultiCharts is not using the cores efficiently during data loading. It seems that data loading is sequential and not multi-threaded as it should be. This is all good for a single chart, but when working with Portfolio Trader this is a big waste. There is huge potential to optimize the code here...


Cheers, Felix

bluefightingcat
Posts: 38
Joined: 31 Oct 2015
Has thanked: 1 time
Been thanked: 2 times

Re: Backtesting with ASCII data painfully slow......

Postby bluefightingcat » 03 Dec 2015

fbertram wrote: "There is huge potential to optimize the code here..."
This definitely seems to be the case.

TJ
Posts: 7176
Joined: 29 Aug 2006
Location: Global Citizen
Has thanked: 990 times
Been thanked: 2049 times

Re: Backtesting with ASCII data painfully slow......

Postby TJ » 03 Dec 2015

I cringe when people talk fudge like this.

What is slow?

MultiCharts is doing its darnedest to make the finest trading product,
and you come here to say it is slow without any hard figures to back it up.

If you have "imported" the data and it is "no different", can you tell us the loading time, before and after?
Or is it just a "feeling"?


As a side note, you might have a 16-core machine, but do you know your bus speed? Is it 40x faster than a 4-core machine?
(in case you are not aware, your hard disk throughput is limited by your bus)

You said 200 stocks, but did not say how many years.
Do you know how many data records you are loading?
hint: 200 stocks x 12 five-minute bars per hour x trading hours per day x trading days per year x number of years
(plus date, time, OHLC, and volume for every record)

ps. OP was dealing with over 10 BILLION records.

bluefightingcat
Posts: 38
Joined: 31 Oct 2015
Has thanked: 1 time
Been thanked: 2 times

Re: Backtesting with ASCII data painfully slow......

Postby bluefightingcat » 03 Dec 2015

Well, I can give you a direct comparison using competing software (it costs about double what MultiCharts costs, but also comes with portfolio testing, including money/portfolio management). Same strategy (or as close as it can be using different coding languages), same number of stocks, and same backtesting period, i.e. the 10 billion records you mentioned. The competing software takes 30 seconds to load the very same ASCII data into the system (compared to about 8 hours for MultiCharts). The backtest itself then takes approximately the same amount of time as MultiCharts.

What seems to be the problem with MultiCharts is that it takes forever to "load the data". I have no idea what is going on whilst the data is loading. The competing software does the same "loading" in about 30 seconds.

Once the data is loaded, the backtest happens in approximately the same time for both MultiCharts and the competing software.

TJ
Posts: 7176
Joined: 29 Aug 2006
Location: Global Citizen
Has thanked: 990 times
Been thanked: 2049 times

Re: Backtesting with ASCII data painfully slow......

Postby TJ » 03 Dec 2015

That is good to hear. You have the times for comparison; this is worth considering.

bluefightingcat
Posts: 38
Joined: 31 Oct 2015
Has thanked: 1 time
Been thanked: 2 times

Re: Backtesting with ASCII data painfully slow......

Postby bluefightingcat » 03 Dec 2015

It's a real pity because I quite like MultiCharts, but it seems to have major limitations for backtesting huge amounts of data.

fbertram
Posts: 166
Joined: 16 Oct 2014
Location: Seattle, USA
Has thanked: 36 times
Been thanked: 73 times

Re: Backtesting with ASCII data painfully slow......

Postby fbertram » 03 Dec 2015

Hi TJ,
TJ wrote: "I cringe when people talk fudge like this."
I am sure we can all benefit from each other's knowledge and experiences - but that will only happen if we talk to each other as equals. Thank you.

So here is the information you've been wondering about:
* the machine I am using is a 16-core Ivy-Bridge Xeon E5 v2 w/ 60GB of memory (what Google calls n1-standard-16). Processor speed is 2.5GHz. Google does not publish additional information on the machines.
* I am loading 200 stocks, 5 minute resolution, 10 years
* loading time for these data is about 25 minutes
TJ wrote: "MultiCharts is doing its darnedest to make the finest trading product, and you come here to say it is slow without any hard figures to back it up."
Don't get me wrong. MultiCharts is an awesome product and I use it every day. I have no intention of bashing the product or its developers. But at the same time, there are a bunch of things that are impacting the way I work and my productivity - and this data loading speed is one of them. The only way to give feedback to the MultiCharts team about our needs and priorities is to talk openly about them.
TJ wrote: "If you have 'imported' the data and it is 'no different'. Can you tell us the loading time? Before and after? Or is it just a 'feeling'?"
I can't tell you the loading time right now, as I have no way to compare things this very moment without putting in several hours of work. But I *had* the data imported until 3 months ago and loading data was painfully slow then. When I updated the data I decided to not go through the import again and use ASCII mapping instead. At that point in time I did not notice any additional penalty. So no, this is not just a feeling but something that I have experienced first hand.

Now what makes me believe this can be sped up?

* CPU load, see the screenshot below. As you can see, none of the 16 cores are busy. We see little spikes, roughly 3 seconds in duration, of about 80% CPU load. These hop from core to core. It looks like there is a single data-loading thread, taking about 3 seconds per chunk, which gets assigned to the cores at random. This process is clearly not CPU-bound.

[screenshot: import_cpu.png - CPU load during data loading]

* Disk load, see the screenshot below. TsServer is reading about 130 kB/s. The system is reading about 2.5 MB/s and writing about 4 MB/s. I wonder why such a large amount of data is written back, and why the majority of I/O is owned by the system. Anyway, this leaves us with a total disk throughput of about 6.5 MB/s. At this rate, the process is not I/O-bound either.

[screenshot: import_disk.png - disk load during data loading]
TJ wrote: "ASCII are fetched one record at a time. This is how the operating system works. MultiCharts has no say in it."
I beg to disagree. Probably this is implemented as a memory-mapped file (which would explain why all the i/o is owned by the system). The operating system will load this to memory at its own discretion, whenever a page fault occurs. The operating system will read this as raw data, no parsing of the data occurs at this stage. Throughput will not be different between ASCII and binary data, but ASCII will have 2 drawbacks: (1) it takes up more bytes for the same information and (2) we need the CPU to parse the CSV data and convert ASCII to binary.

Parsing is not the issue here; we can see that the CPU has 90% spare cycles (or 15 idle cores). Disk throughput shouldn't be the bottleneck either, but depending on the access pattern, seek times might kill the throughput. There is a *lot* that can be done on the programming side to steer this in the right direction.

Cheers, Felix
Attachments: import_disk.png, import_cpu.png

fbertram
Posts: 166
Joined: 16 Oct 2014
Location: Seattle, USA
Has thanked: 36 times
Been thanked: 73 times

Re: Backtesting with ASCII data painfully slow......

Postby fbertram » 03 Dec 2015

... one more data point to back up my claims. The total size of the ASCII data is about 3GB. I can copy these data on the drive in about 35 seconds. During data copy, the operating system will
- read the data from disk to memory
- write the data from memory back to disk

A few thoughts on how this relates to the data loading topic above:
- a read from disk to memory should be faster than a copy
- the copied data span back more than the 10 years being loaded, so a selective read would move even fewer bytes. This assumes the loader is slightly more advanced than reading in all data and dropping everything outside the simulation window.
- the parser can be sped up by a huge amount, as we have a lot of idle CPU power to throw at this

Putting this together, I arrive at the following very crude estimate of how fast the data *should* load on my machine:
* load time from disk: 2 minutes. I have padded this number by a huge amount.
* parse time: currently about 23 minutes after I subtracted the net load time. We can probably speed this up by a factor of 10, leaving about 3 minutes
* in total, I'd say a realistic target to hit would be 5 minutes... not 25.

The limiting factor might actually be something else. I haven't spent any time researching, but judging from the .gdb file extension, MultiCharts' underlying database might be Borland's InterBase (now Firebird). Possibly, data loading involves pushing the data into the database, and possibly that's where the bottleneck is. It might not be feasible for MultiCharts to change any of these underpinnings... but that's a quite different story than "can't be done".

Cheers, Felix

fbertram
Posts: 166
Joined: 16 Oct 2014
Location: Seattle, USA
Has thanked: 36 times
Been thanked: 73 times

Re: Backtesting with ASCII data painfully slow......

Postby fbertram » 03 Dec 2015

I spent a few minutes coding a little Perl script that loads ASCII data from disk, parses them, and puts them into RAM; see below. This is a brute-force implementation, reading in the ASCII files line by line and from the very beginning. It will look at the year and drop everything before 2005. Then, it will push this all into a giant hash structure. After reading a bunch of stocks, it will access one of the records to show that it actually loaded something.

I tried this on a little Macintosh that I had Perl running on. This is a 2.6GHz i5 processor with only 8GB of RAM which is why I had to shorten the test a little bit. I was able to load 50 stocks in a little less than 3 minutes. This would result in 12 minutes to load 200 stocks, which is still twice as fast as MultiCharts. It is worth mentioning here that Perl isn't the fastest language on the planet and that absolutely no optimization has been done here.

I will see if I can find the time to install Perl on my Google machine, so that I can provide data that compare directly to the statements made in previous posts.


Cheers, Felix

Code:

#!/usr/bin/perl -w
#===============================================================================
# File:        data_load_test.pl
# Description: simple loader to demonstrate loading of ASCII data
# History:     FUB, 2015xii03, created
#===============================================================================

use strict;   # keep Perl rules strict!
use warnings;
use Class::Struct;

our %dataStore       = ();
our $numStocksLoaded = 0;
our $maxStocksToLoad = 50;

struct( QuoteRecord => [
    open   => '$',
    high   => '$',
    low    => '$',
    close  => '$',
    volume => '$'
]);

#-------------------------------------------------------------------------------
sub loadFile {
    my ($symbol, $dataFile) = @_;

    print "loading $symbol from $dataFile\n";

    open(my $fileHandle, '<', $dataFile) || die "can't open file $dataFile";
    while (my $line = <$fileHandle>) {
        $line =~ s/[\n\t\r]//g;

        # each line is date,time,open,high,low,close,volume
        my ($date, $time, $open, $high, $low, $close, $volume) = split(",", $line);

        my $key = $symbol . "," . $date . "," . $time;

        my ($month, $day, $year) = $date =~ m{(.*)/(.*)/(.*)};
        next unless defined $year;

        if ($year >= 2005) {
            $dataStore{$key} = QuoteRecord->new(
                open   => $open,
                high   => $high,
                low    => $low,
                close  => $close,
                volume => $volume
            );
        }
    }
    close($fileHandle);
}

#-------------------------------------------------------------------------------
sub loopFiles {
    my ($dataPath) = @_;

    opendir(my $dirHandle, $dataPath) || die "can't open dir $dataPath";
    while (readdir $dirHandle) {
        my $dataFile = $_;
        if ($dataFile =~ m/(.*)\.txt$/) {
            loadFile($1, $dataPath . "/" . $dataFile);
            $numStocksLoaded++;
            last if $numStocksLoaded >= $maxStocksToLoad;
        }
    }
    closedir($dirHandle);
}

#-------------------------------------------------------------------------------
sub main {
    my ($dataPath) = @_;

    my $startTime = localtime;
    print "started $startTime\n";

    loopFiles($dataPath);

    my $endTime = localtime;
    print "finished $endTime, loaded $numStocksLoaded stocks\n";

    my $key    = "AA,10/01/2015,13:25";
    my $record = $dataStore{$key};

    print "key = " . $key . "\n";
    print "o = " . $record->open . "; ";
    print "h = " . $record->high . "; ";
    print "l = " . $record->low . "; ";
    print "c = " . $record->close . "; ";
    print "v = " . $record->volume . "\n";

    return 0;
}

#-------------------------------------------------------------------------------

exit main("/Users/fbertram/Documents/Trading/Data/Kibot/Stocks/5m");

#===============================================================================
# end of file

fbertram
Posts: 166
Joined: 16 Oct 2014
Location: Seattle, USA
Has thanked: 36 times
Been thanked: 73 times

Re: Backtesting with ASCII data painfully slow......

Postby fbertram » 04 Dec 2015

OK, so I played around with this some more. I have changed the code from Perl to C++ and I am using multiple threads. The main thread loads the data from disk. We don't want multiple threads here, as the seeks will otherwise probably kill us. Once the data are loaded, they are passed on to a bunch of threads that parse them. This can be easily done in parallel and we have a good chance of loading multiple cores here. I was using 4 threads for this, as my little i5 has only 2 cores. On the Xeon I am using, a higher number (16?) probably works better.

Here are the results on a MacBook Pro with 2.6GHz i5, 8GB of RAM and an SSD (I will test this on Windows as well, as soon as I have a chance to compile this on Windows):
# of stocks loaded = 200
# of threads = 4
runtime in seconds = 200.843

So, I am loading and parsing these data in a little over 3 minutes... while MultiCharts takes 25 for the same job (not 100% fair: my MultiCharts runs on a Xeon server, which should, if anything, be faster). So, I hope this is not fudge, but enough evidence that there is lots of room for improvement. Please, please, please dear MultiCharts developers: improve the data loading performance in future versions.

Cheers, Felix

Code:

//==============================================================================
// File:        data_load_test.cpp
// Description: simple loader to demonstrate loading of ASCII data
// History:     FUB, 2015xii04, created
//==============================================================================

#include <iostream>
#include <fstream>
#include <sstream>
#include <vector>
#include <map>
#include <string>
#include <thread>
#include <mutex>
#include <atomic>
#include <dirent.h>
#include <chrono>

std::string asciiPath = "/Users/fbertram/Documents/Trading/Data/Kibot/Stocks/5m";
const int maxStocksToLoad = 200;
int numStocksLoaded = 0;
const int maxThreadsToUse = 4;

struct ThreadInfo {
    std::thread* thread;
    std::atomic<bool> done;   // written by the worker, polled by the main thread
    std::string fileName;
    std::string symbol;
    std::string asciiData;

    ThreadInfo() : thread(nullptr), done(false) {}
};

struct QuoteRecord {
    float open;
    float high;
    float low;
    float close;
    int volume;
};

std::map<std::string, QuoteRecord> dataPool;
std::mutex dataPoolMutex;   // dataPool is written by multiple parser threads

//==============================================================================
void parseLine(std::string symbol, std::string line) {
    // split the csv line into date, time, open, high, low, close, volume
    line += ",";
    size_t start = 0;
    size_t end;
    std::vector<std::string> elements;
    while ((end = line.find(',', start)) != std::string::npos) {
        elements.push_back(line.substr(start, end - start));
        start = end + 1;
    }
    if (elements.size() < 7) return;   // skip malformed lines

    // split the date field (mm/dd/yyyy) into month, day, year
    std::string date = elements[0] + "/";
    start = 0;
    std::vector<std::string> mdy;
    while ((end = date.find('/', start)) != std::string::npos) {
        mdy.push_back(date.substr(start, end - start));   // substr on date (was a bug: line)
        start = end + 1;
    }
    if (mdy.size() < 3) return;

    if (std::stoi(mdy[2]) >= 2005) {
        QuoteRecord record;
        record.open   = std::stof(elements[2]);
        record.high   = std::stof(elements[3]);
        record.low    = std::stof(elements[4]);
        record.close  = std::stof(elements[5]);
        record.volume = std::stoi(elements[6]);

        std::string key = symbol + "," + elements[0] + "," + elements[1];

        // serialize writes to the shared map
        std::lock_guard<std::mutex> lock(dataPoolMutex);
        dataPool[key] = record;
    }
}

//==============================================================================
void parseFile(ThreadInfo& ti) {
    std::cout << "parsing " << ti.fileName << std::endl;

    size_t start = 0;
    size_t end;
    while ((end = ti.asciiData.find('\n', start)) != std::string::npos) {
        std::string line = ti.asciiData.substr(start, end - start);
        parseLine(ti.symbol, line);
        start = end + 1;
    }

    std::cout << "finished parsing " << ti.fileName << std::endl;
    ti.done = true;
}

//==============================================================================
std::string loadFile(std::string fileName) {
    std::cout << "loading " << fileName << std::endl;

    // slurp the whole file into memory in one sequential read
    std::ifstream is(fileName);
    std::stringstream ss;
    ss << is.rdbuf();
    is.close();

    std::cout << "finished loading " << fileName << std::endl;

    numStocksLoaded++;
    return ss.str();
}

//==============================================================================
void printRecord(std::string key) {
    QuoteRecord quote = dataPool[key];
    std::cout << "k = " << key << std::endl;
    std::cout << "o = " << quote.open << std::endl;
    std::cout << "h = " << quote.high << std::endl;
    std::cout << "l = " << quote.low << std::endl;
    std::cout << "c = " << quote.close << std::endl;
    std::cout << "v = " << quote.volume << std::endl;
    std::cout << std::endl;
}

//==============================================================================
int main(int argc, char* argv[]) {
    const auto start = std::chrono::system_clock::now();

    std::vector<std::string> asciiFiles;

    DIR* dir = opendir(asciiPath.c_str());
    if (!dir) {
        std::cerr << "can't open dir " << asciiPath << std::endl;
        return -1;
    }
    while (struct dirent* dp = readdir(dir)) {
        if (std::string(dp->d_name).find(".txt") != std::string::npos) {
            asciiFiles.push_back(std::string(dp->d_name));
        }
    }
    closedir(dir);

    ThreadInfo threadInfo[maxThreadsToUse];
    std::vector<std::string>::iterator iter;
    for (iter = asciiFiles.begin(); iter != asciiFiles.end(); iter++) {
        // main thread does the sequential disk read...
        std::string buf = loadFile(asciiPath + "/" + *iter);

        int slot = -1;
        while (slot < 0) {
            // ...then waits for a free parser slot
            for (int i = 0; i < maxThreadsToUse; i++) {
                if (!threadInfo[i].thread) {
                    // found an empty slot
                    slot = i;
                    break;
                }
                if (threadInfo[i].thread && threadInfo[i].done) {
                    // found a finished thread: reap it
                    threadInfo[i].thread->join();
                    delete threadInfo[i].thread;
                    threadInfo[i].thread = nullptr;
                    slot = i;
                    break;
                }
            }
            if (slot < 0) {
                std::this_thread::sleep_for(std::chrono::milliseconds(100));
            }
        }

        threadInfo[slot].done = false;
        threadInfo[slot].asciiData = buf;
        threadInfo[slot].fileName = *iter;
        threadInfo[slot].symbol = iter->substr(0, iter->find(".txt"));
        threadInfo[slot].thread = new std::thread(parseFile, std::ref(threadInfo[slot]));

        if (numStocksLoaded >= maxStocksToLoad) break;
    }

    // wait for all threads to finish
    for (int i = 0; i < maxThreadsToUse; i++) {
        if (threadInfo[i].thread) {
            threadInfo[i].thread->join();
            delete threadInfo[i].thread;
            threadInfo[i].thread = nullptr;
        }
    }

    const auto stop = std::chrono::system_clock::now();
    const auto runtime = std::chrono::duration_cast<std::chrono::nanoseconds>(stop - start).count();

    std::cout << "# of stocks loaded = " << numStocksLoaded << std::endl;
    std::cout << "# of threads = " << maxThreadsToUse << std::endl;
    std::cout << "runtime in seconds = " << runtime / 1e9 << std::endl;
    std::cout << std::endl;

    printRecord("AA,10/01/2015,13:25");
    printRecord("CCU,10/01/2015,13:25");
    printRecord("QCOM,10/01/2015,13:25");
    printRecord("REGN,10/01/2015,13:25");
}

//==============================================================================
// end of file

orion
Posts: 250
Joined: 01 Oct 2014
Has thanked: 65 times
Been thanked: 104 times

Re: Backtesting with ASCII data painfully slow......

Postby orion » 10 Dec 2015

Good point. While the portfolio trader simulation itself can't be parallelized, for understandable reasons, the MC team should look into parallelizing the portfolio trader data load in a future version, since that can significantly improve the user experience.

bluefightingcat
Posts: 38
Joined: 31 Oct 2015
Has thanked: 1 time
Been thanked: 2 times

Re: Backtesting with ASCII data painfully slow......

Postby bluefightingcat » 20 Jan 2016

I've posted a "feature request" regarding this here:

https://www.multicharts.com/pm/viewissu ... no=MC-1987

If you're interested and think this would be useful go and vote!

bluefightingcat
Posts: 38
Joined: 31 Oct 2015
Has thanked: 1 time
Been thanked: 2 times

Re: Backtesting with ASCII data painfully slow......

Postby bluefightingcat » 23 Sep 2016

I wonder whether the Multicharts team has looked into this at all?

Alex MultiCharts
Posts: 194
Joined: 09 Aug 2013
Has thanked: 43 times
Been thanked: 76 times

Re: Backtesting with ASCII data painfully slow......

Postby Alex MultiCharts » 23 Sep 2016

All feature requests are forwarded to the management of the company and are evaluated in a timely manner.
Please note that even though we value your opinion, not all requests can be implemented, as some features do not fit into our current roadmap.

fbertram
Posts: 166
Joined: 16 Oct 2014
Location: Seattle, USA
Has thanked: 36 times
Been thanked: 73 times

Re: Backtesting with ASCII data painfully slow......

Postby fbertram » 23 Sep 2016

Hi Alex,

I understand that you will never be able to implement all feature wishes with the limited resources you have, after all you have a business to run.

But: there are a few issues with MultiCharts that largely affect productivity and workflows, at least for some people. This is one of them, as it costs me many, many CPU hours to wait for MultiCharts to load data. In my personal opinion, issues like this should rank much higher than making cosmetic changes to the UI.

Best regards, Felix

Alex MultiCharts
Posts: 194
Joined: 09 Aug 2013
Has thanked: 43 times
Been thanked: 76 times

Re: Backtesting with ASCII data painfully slow......

Postby Alex MultiCharts » 23 Sep 2016

Hello fbertram,

Thank you for your suggestion, I will forward it to the management.

bluefightingcat
Posts: 38
Joined: 31 Oct 2015
Has thanked: 1 time
Been thanked: 2 times

Re: Backtesting with ASCII data painfully slow......

Postby bluefightingcat » 24 Sep 2016

I agree with fbertram. In addition to MultiCharts, I use competing software that is substantially faster (I think I posted the exact times in previous posts). Clearly it's possible to fix this. Maybe MultiCharts is just built in a way that makes it hard to get this fixed.

