NAME
PDL::IO::BareStore - Simple PDL extension for reading 2-dimensional "big data"
SYNOPSIS
use PDL::IO::BareStore;
# a simple way to transform CSV lines into a packed "database" file
# perl -E 'say join ",", 1..14 for 1..1000' | perl -anF, -e 'chomp;print pack $p, @F}BEGIN{$p = shift' "C*" > ./output.rdb
my $D = PDL::IO::BareStore->new(
file => 'output.rdb', # file name
dims => [14 * 1, 100], # 14 fields x 1 byte each (sizeof "C"), 100 lines per block
# Not quite handy - suggestions welcome
type => 'C*', # the pack() template used to create the database
readCount => 5, # read the file five times
); # readCount is optional; the other parameters are required
while ( $D->nextBlock(\my $b) > 0 ) { # sequentially read the file
# $b is a piddle holding the next block of output.rdb
# do something with $b
}
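The packing one-liner above is dense. The sketch below is a readable, hedged equivalent that works on filehandles; csv_to_rdb is a hypothetical helper name for illustration, not part of this module's API:

```perl
use strict;
use warnings;

# Readable equivalent of the SYNOPSIS one-liner: pack each CSV line with
# a pack() template and write the raw bytes to the output handle.
# (csv_to_rdb is a hypothetical helper, not part of PDL::IO::BareStore.)
sub csv_to_rdb {
    my ($in, $out, $template) = @_;
    binmode $out;                       # raw bytes, no newline translation
    while (my $line = <$in>) {
        chomp $line;
        print {$out} pack $template, split /,/, $line;
    }
}

# Example: pack two CSV rows of 14 unsigned bytes each, as in the SYNOPSIS,
# using in-memory filehandles instead of real files.
my $csv = join("\n", map { join ',', 1 .. 14 } 1 .. 2) . "\n";
open my $in,  '<', \$csv    or die $!;
open my $out, '>', \my $rdb or die $!;
csv_to_rdb($in, $out, 'C*');
close $out;
# $rdb now holds 2 * 14 = 28 bytes of packed data.
```

With real files, replace the in-memory handles with `open my $in, '<', 'input.csv'` and `open my $out, '>', 'output.rdb'`.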
DESCRIPTION
Written for reading large 2-dimensional data sets. I ran out of memory loading a large data set into a piddle with PDL::IO::FastRaw, and I still cannot figure out how to use PDL::IO::FlexRaw, so this module provides a simple wrapper for reading large binary data into a sequence of smaller piddles.
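The core idea is simply to read a fixed number of fixed-size records per call instead of slurping the whole file. The following is a minimal sketch of that idea in plain Perl; read_block is a hypothetical helper for illustration, not the module's actual implementation (which wraps the result in a piddle):

```perl
use strict;
use warnings;

# Sketch of the block-wise idea behind nextBlock(): read at most
# $lines records of $record_bytes bytes each per call.
# (read_block is a hypothetical helper, not this module's code.)
sub read_block {
    my ($fh, $record_bytes, $lines) = @_;
    my $got = read $fh, my $buf, $record_bytes * $lines;
    return undef unless $got;   # EOF (or read error)
    return $buf;                # caller unpacks / wraps in a piddle
}

# Example: a 28-byte in-memory "file" of 2 records x 14 one-byte fields.
my $data = pack 'C*', (1 .. 14) x 2;
open my $fh, '<', \$data or die $!;
binmode $fh;
my $block  = read_block($fh, 14, 1);    # one 14-byte record
my @fields = unpack 'C*', $block;       # the 14 field values
```

Repeated calls walk through the file block by block and return undef at end of file, which is what lets the SYNOPSIS loop terminate.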
kmx's PDL::IO::DBI is a really good solution, but I do not want to use a relational database just to store my data.
SEE ALSO
You may also be interested in PDL::IO::FlexRaw and PDL::IO::DBI.
AUTHOR
Kwok Lok Chung, Baggio, <rootkwok @ cpan.org>
LICENSE
Copyright (C) 2015 by Kwok Lok Chung, Baggio.
This work is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/4.0/.