- 30 Jun, 2015 1 commit
-
Ivan Tyagov authored
-
- 29 Jun, 2015 4 commits
-
Kirill Smelkov authored
Instead of verifying only the min/max/len of the result, we can verify that the result is exactly the array we expect; this is also easier to do and takes fewer lines with just an appropriate arange() and array_equal(). /cc @Tyagov
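A minimal sketch of the check described above, assuming numpy and a 1-d result of 100000 sequential integers (names are illustrative, not the actual test code):

    import numpy as np

    result = np.arange(100000)      # stand-in for the array the test produces

    # before: only aggregate properties were checked
    assert result.min() == 0
    assert result.max() == 99999
    assert len(result) == 100000

    # after: compare against the exact expected array in one step
    assert np.array_equal(result, np.arange(100000))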
-
Kirill Smelkov authored
As explained in the previous commit, the real_data tail ended without a trailing \n:

    99988,99989\n99990,99991,99992,99993,99994,99995,99996,99997,99998,99999\n100000

and because DataStream_copyCSVToDataArray() processes data in full lines only, the tail was lost. Fix it by making sure the last line is always properly terminated with \n. /cc @Tyagov
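A sketch of the fix, assuming the data-generation code quoted in the commit message below (the chunks() helper comes from there):

    number_string_list = []
    for my_list in chunks(range(0, 100001), 10):
        number_string_list.append(','.join([str(x) for x in my_list]))
    # terminate the last line too, so full-line processing does not drop the tail
    real_data = '\n'.join(number_string_list) + '\n'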
-
Kirill Smelkov authored
Consider this:

    In [1]: l = range(100000)

    In [2]: min(l)
    Out[2]: 0

    In [3]: max(l)
    Out[3]: 99999

    In [4]: len(l)
    Out[4]: 100000

so if we assert that the zarray min=0 and max=99999, the length should be max+1, which is 100000. NOTE: the length is not 100001, as one would guess from the test number sequence created at the beginning of the test:

    def chunks(l, n):
        """Yield successive n-sized chunks from l."""
        for i in xrange(0, len(l), n):
            yield l[i:i+n]
    ...
    number_string_list = []
    for my_list in list(chunks(range(0, 100001), 10)):
        number_string_list.append(','.join([str(x) for x in my_list]))
    real_data = '\n'.join(number_string_list)

because the processing code "eats" numbers only up to the last \n, and for the 100001-number sequence the last \n is located before 100000:

    99988,99989\n99990,99991,99992,99993,99994,99995,99996,99997,99998,99999\n100000

I will fix the input data generation in the following patch. /cc @Tyagov
-
Kirill Smelkov authored
When we conditionally create a new BigArray for appending data, we should create it empty, because in DataStream_copyCSVToDataArray() creation is done lazily only when the destination array is not yet initialized, and we append data to the array in the following code block anyway. Creating the BigArray with the initial shape of the appended part results in the destination array being longer than necessary by the length of the first appended chunk, with the extra header introduced this way reading as all zeros. Fix it. /cc @Tyagov
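A minimal numpy sketch of the bug and the fix; ZBigArray specifics are left out and the names are illustrative:

    import numpy as np

    chunk = np.arange(10)

    # buggy: destination created with the first chunk's shape, then the chunk
    # is appended as well -> a zero-filled header of len(chunk) extra items
    dst = np.zeros(chunk.shape, dtype=chunk.dtype)
    dst = np.append(dst, chunk)
    assert len(dst) == 2 * len(chunk)   # longer than necessary

    # fixed: create the destination empty; the following append fills it
    dst = np.zeros((0,), dtype=chunk.dtype)
    dst = np.append(dst, chunk)
    assert np.array_equal(dst, chunk)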
-
- 26 Jun, 2015 1 commit
-
Ivan Tyagov authored
source_section -> source
destination_section -> destination
-
- 25 Jun, 2015 1 commit
-
Ivan Tyagov authored
-
- 24 Jun, 2015 3 commits
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
Ivan Tyagov authored
Stop using the \n character as an ingestion delimiter; use it only for the .CSV format, where it is part of the structure of the file.
-
- 22 Jun, 2015 4 commits
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
- 10 Jun, 2015 1 commit
-
Ivan Tyagov authored
-
- 09 Jun, 2015 2 commits
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
- 05 Jun, 2015 4 commits
-
Ivan Tyagov authored
-
Ivan Tyagov authored
Clean up the iterate script implementation. There is no need for our own array property type when we can use the more generic object type.
-
Ivan Tyagov authored
-
Ivan Tyagov authored
Add a new method that can copy CSV data to a ZBigArray.
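A hedged sketch of the full-lines-only copy loop such a method could use; this is illustrative, not the actual DataStream_copyCSVToDataArray() code, and both the chunk iterator and the destination array are assumptions:

    import numpy as np

    def copy_csv_to_array(chunks, dst):
        """chunks yields raw CSV text pieces; dst is a growable 1-d int array."""
        tail = ''
        for piece in chunks:
            data, _, tail = (tail + piece).rpartition('\n')
            if data:    # process full lines only; carry the rest over
                numbers = [int(x) for line in data.split('\n')
                                  for x in line.split(',')]
                dst = np.append(dst, np.asarray(numbers, dtype=dst.dtype))
        # an unterminated final line stays in tail and is never flushed, which
        # is exactly why the test data must end with \n (see the 29 Jun, 2015
        # commits above)
        return dst

For example, copy_csv_to_array(iter(['1,2,3\n4,', '5\n']), np.zeros((0,), dtype='int64')) returns array([1, 2, 3, 4, 5]).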
-
- 04 Jun, 2015 2 commits
-
Ivan Tyagov authored
-
Ivan Tyagov authored
Add a new property of type array.
-
- 02 Jun, 2015 6 commits
-
Ivan Tyagov authored
Better end condition.
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
Ivan Tyagov authored
Allow the API to pass a reference to the object (i.e. a Data Array) where the result of the transformation is expected to be stored.
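A small sketch of the kind of call this enables; the script name and parameter are hypothetical, for illustration only:

    # before: the transformation decided where its result goes
    data_stream.DataStream_transform(transform_script_id)

    # after: the caller passes the reference of the destination Data Array
    data_stream.DataStream_transform(transform_script_id,
                                     destination_reference='my-data-array')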
-
- 29 May, 2015 3 commits
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
Ivan Tyagov authored
Add a generic implementation of a script that can iterate efficiently over a Data Stream and apply transformations to the data itself.
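A hedged sketch of the iteration pattern this describes; the accessors and names are assumptions, not the actual script:

    def iterate_stream(stream, transform, chunk_size=1024 * 1024):
        """Apply transform to the stream chunk by chunk, so arbitrarily large
        streams are processed in constant memory."""
        offset = 0
        size = stream.getSize()     # assumed accessor
        while offset < size:
            chunk = stream.read(offset, chunk_size)     # assumed accessor
            transform(chunk, offset)
            offset += len(chunk)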
-
- 28 May, 2015 2 commits
-
Ivan Tyagov authored
-
Ivan Tyagov authored
-
- 27 May, 2015 3 commits
-
Kirill Smelkov authored
-
Ivan Tyagov authored
Wendelin is a Big Data platform based on ERP5. More information can be found at http://www.wendelin.io/

The current repository contains the following:

* bt5/ - the generic ERP5 Business Templates needed to set up Wendelin on top of ERP5
* product/ - the Wendelin file system product
* slapos/ - the SlapOS setup recipe
* tests/ - test definitions for the Wendelin platform
-
Kirill Smelkov authored
Wendelin is a Big Data platform based on ERP5. More information can be found at http://www.wendelin.io/
-
- 17 Apr, 2015 1 commit
-
Ivan Tyagov authored
-
- 30 Mar, 2015 2 commits
-
Ivan Tyagov authored
Use IndexedDB storage locally on the browser side.
-
Ivan Tyagov authored
-