* Mon Aug 13 2018 mcepl@suse.com
- Remove dependency on unittest2
* Wed May 09 2018 toddrme2178@gmail.com
- Use license tag
* Thu Apr 19 2018 toddrme2178@gmail.com
- Update to 1.2.1
  * C-Blosc internal sources updated to 1.14.3. This basically means that
    internal Zstd sources are bumped to 1.3.4, which may lead to noticeably
    improved speeds (especially for low compression ratios).
* `np.datetime64` and other scalar objects that have `__getitem__()` are now
supported in _eval_blocks(). PR #377. Thanks to apalepu23.
* Vendored cpuinfo.py updated to 4.0.0 (ARM aarch64 is recognized now).
  * Allow setup.py to work even when no Intel, ARM or PPC archs are found.
- Update to 1.2.0
* Support for Python <= 2.6 or Python <= 3.4 has been deprecated.
* C-Blosc internal sources updated to 1.14.2. Using a C-Blosc library
> 1.14 is important for forward compatibility. For more info see:
http://blosc.org/posts/new-forward-compat-policy/
* Wed Aug 09 2017 toddrme2178@gmail.com
- Only works on x86-like CPUs
* Wed May 10 2017 toddrme2178@gmail.com
- Implement single-spec version.
- Fix source URL.
- Update to 1.1.2
* Updated setup.py to include Zstd codec in Blosc. Fixes #331.
- Update to 1.1.1
  * Allow deleting all the columns in a ctable. Fixes #306.
  * Double-check the value of a column that is being overwritten. Fixes
    #307.
* Use `pkg_resources.parse_version()` to test for version of packages.
Fixes #322.
  * All the columns in a ctable are now enforced to be carray instances
    in order to simplify the internal logic for handling columns.
  * The cparams are now preserved during column replacement, e.g.:
    `ct['f0'] = x + 1`
    will continue to use the same cparams as the original column.
* C-Blosc updated to 1.11.2.
* Added a new `defaults_ctx` context so that users can select defaults
easily without changing global behaviour. For example::
      with bcolz.defaults_ctx(vm="python", cparams=bcolz.cparams(clevel=0)):
          cout = bcolz.eval("(x + 1) < 0")
* Fixed a crash occurring in `ctable.todataframe()` when both `columns`
and `orient='columns'` were specified. PR #311. Thanks to Peter
Quackenbush.
- Update to 1.1.0
Highlights:
* Much improved performance of ctable.where() and ctable.whereblocks().
Now bcolz is getting closer than ever to fundamental memory limits
during queries (see the updated benchmarks in the data containers
tutorial below).
  * Better support for Dask; i.e. the GIL is released during Blosc
    operation when bcolz is called from a multithreaded app (like Dask).
    Also, Dask can be used as another virtual machine for evaluating
    expressions (so now it is possible to use it during queries too).
  * New ctable.fetchwhere() method for getting the rows fulfilling some
    condition in one go (see the sketch after this list).
* New quantize filter for allowing lossy compression of floating point
data.
* It is possible to create ctables with more than 255 columns now.
Thanks to Skipper Seabold.
  * The defaults during carray creation are scalars now. This allows
    creating high-dimensional data containers more efficiently.
  * The carray object now implements the __array__() special method. With
    this, interoperability with numpy arrays is easier and faster.
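  A minimal sketch of the fetchwhere() and __array__() additions (the
  column names, values and condition below are invented for illustration,
  not taken from upstream)::
    import numpy as np
    import bcolz

    x = np.arange(10)
    ct = bcolz.ctable(columns=[x, x * 2.0], names=['f0', 'f1'])

    # fetchwhere() returns the rows fulfilling the condition in one go
    hits = ct.fetchwhere('f0 > 5')

    # __array__() lets numpy convert a carray column transparently
    arr = np.asarray(ct['f0'])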
- update to version 1.0.0:
* New version of embedded C-Blosc (bumped to 1.8.1). This allows for
using recent C-Blosc features like the BITSHUFFLE filter that
generally allows for better compression ratios at the expense of
some slowdown. Look into the carray tutorial on how to use the new
BITSHUFFLE filter.
  * Use the -O1 flag for compiling the included C-Blosc sources on
    Linux. This results in slower performance, but it fixes nasty
    segfaults, as can be seen in issue #110 of python-blosc. Also, a
    warning is printed suggesting the use of an external C-Blosc library.
  * Improved support for operations with carrays of shape (N, 1). PR
    #296. Fixes #165 and #295. Thanks to Kevin Murray.
  * Check whether a column exists before inserting a new one into a
    ctable via __setitem__; if it exists, the existing column is
    overwritten. Fixes #291.
* Some optimisations have been made within carray.__getitem__ to
improve performance when extracting a slice of data from a
carray. This is particularly relevant when running some
computation chunk-by-chunk over a large carray. (#283 @alimanfoo).
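  A minimal sketch of the chunk-by-chunk access pattern that benefits
  from the faster carray.__getitem__ (the array contents are invented
  for illustration)::
    import numpy as np
    import bcolz

    ca = bcolz.carray(np.random.rand(1000000))

    total = 0.0
    # each slice decompresses only the requested range into a numpy array
    for start in range(0, len(ca), ca.chunklen):
        total += ca[start:start + ca.chunklen].sum()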
* Thu Mar 10 2016 toddrme2178@gmail.com
- update to version 0.12.1:
* setup.py now defers operations requiring numpy and Cython until
after those modules have been installed by setuptools. This means
that users no longer need to pre-install numpy and Cython to
install bcolz.
- update to version 0.12.0:
* Fixes an installation glitch for Windows. (#268 @cgohlke).
* The tutorial is now a Jupyter notebook. (#261 @FrancescElies).
* Replaces numpy float string specifier in test with
numpy.longdouble (#271 @msarahan).
* Fix for allowing the use of variables of type string in eval() and
other queries. (#273, @FrancescAlted).
  * The size of the tables during import/export to HDF5 is honored
    now via the expectedlen (bcolz) and expectedrows (PyTables)
    parameters (@FrancescAlted).
* Update only the valid part of the last chunk during boolean
assignments. Fixes a VisibleDeprecationWarning with NumPy 1.10
(@FrancescAlted).
* More consistent string-type checking to allow use of unicode
strings in Python 2 for queries, column selection, etc. (#274
@BrenBarn).
* Installation no longer fails when listed as dependency of project
installed via setup.py develop or setup.py install. (#280 @mindw,
fixes #277).
* Paver setup has been deprecated (see #275).
- Update to version 0.11.4
- The .pyx extension is not packed using the absolute path anymore.
(#266 @FrancescAlted)
- Update to version 0.11.3
- Implement feature #255: bcolz.zeros() can create new ctables too, either
  empty or filled with zeros (see the sketch below).
  (#256 @FrancescElies @FrancescAlted)
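  A short sketch of the new bcolz.zeros() behaviour, assuming a compound
  dtype is what selects ctable creation (the field spec is invented for
  illustration)::
    import bcolz

    # a structured dtype makes zeros() build a ctable instead of a carray
    ct_empty  = bcolz.zeros(0, dtype="i4,f8")   # empty ctable
    ct_filled = bcolz.zeros(10, dtype="i4,f8")  # ten zeroed rows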
- Update to version 0.11.2
- Changed the `setuptools>18.3` dependency to `setuptools>18.0` because
Anaconda does not have `setuptools > 18.1` yet.
- Update to version 0.11.1
- Do not try to flush when a ctable is opened in 'r'ead-only mode.
See issue #252.
- Added the mock dependency for Python2.
- Added a `setuptools>18.3` dependency.
- Several fixes in the tutorial (Francesc Elies).
- Update to version 0.11.0
- Added support for appending a np.void to ctable objects
(closes ticket #229 @eumiro)
- Do not try to flush when a carray is opened in 'r'ead-only mode.
  (closes #241 @FrancescAlted).
- Fix appending of object arrays to already existing carrays
(closes #243 @cpcloud)
- Great modernization of setup.py by using new versioning and many
other improvements (PR #239 @mindw).
- update to version 0.10.0:
* Fix pickle for in-memory carrays. (#193 #194 @dataisle @esc)
  * Implement a chunks iterator, which allows the syntax `for chunk_
    in ca._chunks`; also added an "internal use" indicator to the carray
    chunks attribute. (#153 @FrancescElies and @esc)
* Fix a memory leak and avoid copy in chunk.getudata. (#201 #202
@esc)
* Fix the error message when trying to open a fresh ctable in an
existing rootdir. (#191 @twiecki @esc)
* Solve #22 and be more specific about carray private methods. (#209
@FrancescElies @FrancescAlted)
  * Implement context manager support for carray and ctable (see the
    sketch after this list). (#135 #210 @FrancescElies and @esc)
* Fix handling and API for leftovers. (#72 #132 #211 #213
@FrancescElies @esc)
* Fix bug for incorrect leftover value. (#208 @waylonflinn)
* Documentation: document how to write extensions, update docstrings
and mention the with statement / context manager. (#214
@FrancescElies)
* Various refactorings and cleanups. (#190 #198 #197 #199 #200)
* Fix bug creating carrays from transposed arrays without explicit
dtype. (#217 #218 @sdvillal)
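  A minimal sketch of the new context-manager support (the rootdir path
  and data are invented for illustration)::
    import numpy as np
    import bcolz

    # leaving the with-block flushes the on-disk carray automatically
    with bcolz.carray(np.arange(10), rootdir='ca.bcolz', mode='w') as ca:
        ca.append(np.arange(10))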
* Tue May 19 2015 toddrme2178@gmail.com
- Update to version 0.9.0
  * Implement 'addcol' and 'delcol' to properly handle on-disk
    tables (see the sketch after this list).
    + 'addcol' now has a new keyword argument 'move' that allows
      you to specify whether you want to move or copy the data.
    + 'delcol' has a new keyword argument 'keep' which allows you
      to preserve the data on disk when removing a column.
    + Additionally, ctable now supports an 'auto_flush' keyword
      that makes it flush to disk automatically after any methods
      that may write data.
* Keep the GIL while calling Blosc compress and decompress in
order to support lock-free operation of newer Blosc versions
(1.5.x and beyond) that no longer have a global state.
  * Distribute 'carray_ext.pxd' as part of the package via
    PyPI to ease building applications on top of bcolz, for example
    *bquery*.
  * The Sphinx-based API documentation is now autogenerated from
    the docstrings in the Python sources.
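  A rough sketch of the 'addcol'/'delcol'/'auto_flush' additions above
  (column names, data and rootdir paths are invented for illustration)::
    import numpy as np
    import bcolz

    ct = bcolz.ctable(columns=[np.arange(5)], names=['f0'],
                      rootdir='ct.bcolz', auto_flush=True)
    newcol = bcolz.carray(np.arange(5) * 2.0, rootdir='f1.bcolz', mode='w')
    # move=True relocates the on-disk data instead of copying it
    ct.addcol(newcol, name='f1', move=True)
    # keep=True preserves the column data on disk when removing it
    ct.delcol('f1', keep=True)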
- update to version 0.8.1:
* Downgrade to Blosc v1.4.1 (#144 @esc)
* Fix license include (#143 @esc)
* Upgrade to Cython 0.22 (#145 @esc)
- update to version 0.8.0:
  * Public API for carray (#98 @FrancescElies and @esc)
  * A Cython definition file carray_ext.pxd was added that contains
    the definitions for the carray, chunks and chunk classes. This was
    done to allow more complex programs to be built on the compressed
    container primitives provided by bcolz.
* Overhaul the release procedure
* Other miscellaneous fixes and improvements
- changes from version 0.7.3:
* Update to Blosc v1.5.2
  * Added support for pickling persistent carray/ctable
    objects. Basically, what is serialized is the rootdir, so the data
    is still sitting on disk and the original contents in rootdir are
    still needed for unpickling (see the sketch after this list).
    (#79 @mrocklin)
* Fixed repr-ing of datetime64 carray objects (#99 @cpcloud)
* Fixed Unicode handling for column addressing (#91 @CarstVaartjes)
* Conda recipe and .binstar.yml (#88 @mrocklin and @cpcloud)
* Removed unittest2 as a run-time dependency (#90 @mrocklin)
* Various typo fixes. (#75 @talumbau, #86 @catawbasam and #83
@bgrant)
* Other miscellaneous fixes and improvements
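  A minimal sketch of pickling a persistent carray (the rootdir path and
  data are invented for illustration); only the rootdir reference is
  serialized, the data itself stays on disk::
    import pickle
    import numpy as np
    import bcolz

    ca = bcolz.carray(np.arange(10), rootdir='ca.bcolz', mode='w')
    ca.flush()
    # unpickling reopens the same on-disk container
    ca2 = pickle.loads(pickle.dumps(ca))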
* Mon Oct 13 2014 toddrme2178@gmail.com
- Update to 0.7.2
* Bugfix release
* Thu Sep 04 2014 toddrme2178@gmail.com
- Initial version