v4.0.1: Performance improvements when supplying your own RNG. Thanks to @zdylag for the contribution. Ref PR:


Minor release of go-tdigest. This release exposes `ToBytes`, which allows reusing an existing buffer instead of always allocating a new one via `AsBytes`.


Major release of go-tdigest. This release brings in some API and internal changes, but behavior should remain the same.

CHANGES

* A local RNG is used by default, instead of the global one
* Internal counters were upgraded to 64-bit integers
* (Cosmetic) Configuring compression uses a float64 instead of a uint32


Minor release of go-tdigest. This release introduces support for configuring the deserialized digest when using [tdigest.FromBytes][docs]. Right now this is mostly useful for configuring the RNG the digest will use, for example:

```go
t1, err := tdigest.FromBytes(payload, tdigest.LocalRandomNumberGenerator(42))
```

[docs]:


Patch release of go-tdigest. This patch contains a fix for code introduced in v2.0.0 where deserializing a payload could cause errors due to unexpected nil pointers.


Minor release of go-tdigest. This release contains several performance-oriented patches and was made possible mostly by @vmihailenco adding benchmarks, optimizing existing code and backporting improvements from the @honeycombio fork written by @ianwilkes. Notably, the optimizations introduced in v2.0.0 by using a fenwick tree to cache prefix sums have been refined (#23) and, later, removed (#25) after the introduction of more thorough benchmarks.

New Public API
==============

* TDigest instances may now be duplicated via [Clone()][clone]
* You may inspect the compression any given digest has been configured to use by calling [Compression()][compression]
* A tdigest instance may be reused/reinitialized (generally to minimize allocations) directly from a buffer via [FromBytes(bytes)][frombytes]
* You may now opt to destroy a given digest when merging for the sake of performance: `t1.Merge(t2)` may be replaced by [MergeDestructive][merge] for faster execution, but you must make sure not to use `t2` afterwards since its internal state will be invalid.

[clone]:
[compression]:
[frombytes]:
[merge]:

Dependency Changes
==================

We don't import `yourbasic/fenwick` anymore, so this library no longer requires any external dependency (test dependencies remain unchanged).

Other
=====

There's been some discussion about performance and changing counts to 64-bit at #20. Much of what has been discussed is done, and the next changes will likely require an API change, so this might be one of the last v2 releases. Keep in mind that the migration to the future v3 should be really simple; even easier than the v1 to v2 path.


v2.1.0: Minor release of go-tdigest. This release adds support for [TrimmedMean][1] (#22) and reduces inaccuracies caused by floating point math (#19). Many thanks to @mcbridne and @vmihailenco for making this happen.

[1]:


Patch release of go-tdigest. The previous release (v2.0.1) attempted to fix #17 but was insufficient. This release contains PR #18, which properly fixes the CDF estimation on the right extreme. Thanks to @zeebo for reporting, investigating and fixing!


Patch release of go-tdigest. This release fixes an issue where we would incorrectly estimate the CDF when reaching the end of the summary (Issue #17).


Major release of go-tdigest. This release contains major API changes and significant performance improvements to the tdigest package. All users are encouraged to upgrade.

Performance Improvements
========================

The critical path of this library (adding samples to the digest) has been drastically sped up by making use of a [binary indexed tree][bit], so that prefix sums and updates don't necessarily have to scan most of the storage. Results from `benchcmp` on a late-2013 MacBook Air (running Linux):

    benchmark           old ns/op   new ns/op   delta
    BenchmarkAdd1-4     187         206         +10.16%
    BenchmarkAdd10-4    332         274         -17.47%
    BenchmarkAdd100-4   1092        325         -70.24%

[bit]:

Additionally, it's now possible to create a digest that uses a custom random number generator, which means that if you were suffering from lock contention (due to heavy usage of the shared RNG), you can easily enable more speed gains by creating your digests with:

```go
digest := tdigest.New(
    tdigest.Compression(200),
    tdigest.LocalRandomNumberGenerator(),
)
```

API Changes
===========

The tdigest API has been drastically simplified with the goal of making it more readily usable without requiring people to read up on and understand what, for example, compression means.

Modifications
-------------

- The `Add(float64,uint32)` method has been renamed to `AddWeighted`

Additions
---------

- Construction is now done via `New()`, which accepts configuration parameters while providing sane defaults
- There is a new `Add(float64)` method that works as a shortcut for `AddWeighted(float64,1)`
- The `Count()` method has been introduced to allow users to decide what to do when the digest grows too much
- The `CDF(float64)` method has been added. It stands for [cumulative distribution function][cdf] and is useful for asking the inverse of the question asked via `Quantile(x)`: it answers at which fraction (quantile) of the data all seen samples are less than or equal to the given `x`.
[cdf]:

Removals
--------

- There is no `Len()` method anymore since it provided no real actionable information
- `New(float64)` doesn't exist anymore; it's been replaced by a simpler `New()`

External Dependencies
=====================

Two dependencies have been introduced (v1.x had zero):

- [yourbasic/fenwick][fen], used to speed up prefix sum computations, allowing major performance improvements
- (**test only**) [leesper/go_rng][rng], for generating non-uniform distributions to assist with testing

Other changes
=============

- This project now uses [dep][] for dependency management
- A single digest can be used to summarize more than 4B data points
- We now have [contribution guidelines][contrib] :-)

[contrib]:
[dep]:
[fen]:
[rng]:


Patch release of go-tdigest. This release fixes serious issues with the tdigest package: data with high variance could cause very erroneous estimations for specific quantiles. More details in PR #11 and Issue #12.

Minor changes:

- TestMerge is now slower; use the `-short` flag if you want to skip it

Thanks to @christineyen for thoughts and investigation.


Patch release of go-tdigest. This release fixes an issue where `Quantile` would return nonsensical numbers when the number of samples in the digest is too small (PR #11).


Patch release of go-tdigest. This release contains minor bugfixes regarding internal error handling and has overall better error reporting. No notable changes.


Minor release of go-tdigest. All previous releases contained a bug that would hurt accuracy whenever compression was triggered. Notable changes since v1.0.0:

* New functions: Len() and ForEachCentroid (PR #7)
* Fixed a serious accuracy bug where counts would grow without bound (PR #10)

Thanks to @ajgillis and @ianwilkes for their contributions.


First stable release of the tdigest package. I'm finally happy with the performance and the interface, so I'm considering this ready to be used in production. From now on, releases will follow proper semantic versioning. Notable changes since v0.3.0:

* More performance improvements
* Renamed `Percentile` to `Quantile`


Third unstable release of the tdigest package. Notable changes:

* Massive performance improvements by moving away from a tree structure to slices for centroid storage
* (De)serialization fully compatible with the Java version
* Renamed method `Update()` to `Add()` for consistency's sake


Second unstable release. Notable changes:

* (De)serialization no longer panics; it forces the consumer to handle the error
* Fixed a major goroutine leak (Issue #1)


First unstable release of go-tdigest. All goals from the initial roadmap are implemented:

* Basic functionality for computing quantiles
* Java-compatible serialization
* Support for merging t-digests