Change log#
Best viewed here.
jax 0.4.21#
jaxlib 0.4.21#
jax 0.4.20 (Nov 2, 2023)#
jaxlib 0.4.20 (Nov 2, 2023)#
Bug fixes
Fixed some type confusion between E4M3 and E5M2 float8 types.
jax 0.4.19 (Oct 19, 2023)#
New Features
Added `jax.typing.DTypeLike`, which can be used to annotate objects that are convertible to JAX dtypes.
Added `jax.numpy.fill_diagonal`.
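A minimal sketch of both additions (the `make_buffer` helper is hypothetical; note that JAX's `fill_diagonal` is functional and requires `inplace=False`):

```python
import jax.numpy as jnp
from jax.typing import DTypeLike

# DTypeLike covers strings, NumPy dtypes, and scalar types.
def make_buffer(n: int, dtype: DTypeLike):
    return jnp.zeros(n, dtype=dtype)

buf = make_buffer(4, "float32")

# Unlike NumPy, JAX arrays are immutable, so fill_diagonal returns a new array.
eye_ish = jnp.fill_diagonal(jnp.zeros((3, 3)), 1.0, inplace=False)
```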
Changes
JAX now requires SciPy 1.9 or newer.
Bug fixes
Only process 0 in a multi-controller distributed JAX program will write persistent compilation cache entries. This fixes write contention if the cache is placed on a network filesystem such as GCS.
The version check for cusolver and cufft no longer considers the patch versions when determining if the installed version of these libraries is at least as new as the versions against which JAX was built.
jaxlib 0.4.19 (Oct 19, 2023)#
Changes
jaxlib will now always prefer pip-installed NVIDIA CUDA libraries (nvidia-… packages) over any other CUDA installation if they are installed, including installations named in `LD_LIBRARY_PATH`. If this causes problems and the intent is to use a system-installed CUDA, the fix is to remove the pip-installed CUDA library packages.
jax 0.4.18 (Oct 6, 2023)#
jaxlib 0.4.18 (Oct 6, 2023)#
Changes
CUDA jaxlibs now depend on the user to install a compatible NCCL version. If using the recommended `cuda12_pip` installation, NCCL should be installed automatically. Currently, NCCL 2.16 or newer is required.
We now provide Linux aarch64 wheels, both with and without NVIDIA GPU support.
Deprecations
A number of internal utilities and inadvertent exports in `jax.lax` have been deprecated, and will be removed in a future release.
`jax.lax.dtypes`: use `jax.dtypes` instead.
`jax.lax.itertools`: use `itertools` instead.
`naryop`, `naryop_dtype_rule`, `standard_abstract_eval`, `standard_naryop`, `standard_primitive`, `standard_unop`, `unop`, and `unop_dtype_rule` are internal utilities, now deprecated without replacement.
Bug fixes
Fixed Cloud TPU regression where compilation would OOM due to smem.
jax 0.4.17 (Oct 3, 2023)#
New features
Added the new `jax.numpy.bitwise_count()` function, matching the API of the similar function recently added to NumPy.
Deprecations
Removed the deprecated module `jax.abstract_arrays` and all its contents.
Named key constructors in `jax.random` are deprecated. Pass the `impl` argument to `jax.random.PRNGKey()` or `jax.random.key()` instead:
`random.threefry2x32_key(seed)` becomes `random.PRNGKey(seed, impl='threefry2x32')`
`random.rbg_key(seed)` becomes `random.PRNGKey(seed, impl='rbg')`
`random.unsafe_rbg_key(seed)` becomes `random.PRNGKey(seed, impl='unsafe_rbg')`
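A short sketch of the migration, assuming a `threefry2x32` key:

```python
import jax

seed = 0
# Before (deprecated): key = jax.random.threefry2x32_key(seed)
key = jax.random.PRNGKey(seed, impl="threefry2x32")   # raw uint32[2] key

# jax.random.key produces a typed key array instead of a raw vector.
typed_key = jax.random.key(seed, impl="threefry2x32")
```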
Changes:
CUDA: JAX now verifies that the CUDA libraries it finds are at least as new as the CUDA libraries that JAX was built against. If older libraries are found, JAX raises an exception since that is preferable to mysterious failures and crashes.
Removed the "No GPU/TPU found" warning. Instead, JAX warns if, on Linux, an NVIDIA GPU or a Google TPU is found but not used and `--jax_platforms` was not specified.
`jax.scipy.stats.mode()` now returns a 0 count if the mode is taken across a size-0 axis, matching the behavior of `scipy.stats.mode` in SciPy 1.11.
Most `jax.numpy` functions and attributes now have fully-defined type stubs. Previously many of these were treated as `Any` by static type checkers like `mypy` and `pytype`.
jaxlib 0.4.17 (Oct 3, 2023)#
Changes:
Python 3.12 wheels were added in this release.
The CUDA 12 wheels now require CUDA 12.2 or newer and cuDNN 8.9.4 or newer.
Bug fixes:
Fixed log spam from ABSL when the JAX CPU backend was initialized.
jax 0.4.16 (Sept 18, 2023)#
Changes
Added `jax.numpy.ufunc`, as well as `jax.numpy.frompyfunc()`, which can convert any scalar-valued function into a `numpy.ufunc`-like object, with methods such as `outer()`, `reduce()`, `accumulate()`, `at()`, and `reduceat()` (#17054).
When not running under IPython: when an exception is raised, JAX now filters out the entirety of its internal frames from tracebacks, rather than appending an "unfiltered stack trace". This should produce much friendlier-looking tracebacks. See here for an example. This behavior can be changed by setting `JAX_TRACEBACK_FILTERING=remove_frames` (for two separate unfiltered/filtered tracebacks, which was the old behavior) or `JAX_TRACEBACK_FILTERING=off` (for one unfiltered traceback).
The jax2tf default serialization version is now 7, which introduces new shape safety assertions.
Devices passed to `jax.sharding.Mesh` should be hashable. This specifically applies to mock devices or user-created devices. `jax.devices()` are already hashable.
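A small sketch of the new ufunc machinery (the scalar function here is just an illustrative addition):

```python
import jax.numpy as jnp

# Wrap a scalar-valued function into a NumPy-style ufunc; an identity is
# needed so that reduce() has a starting value.
scalar_add = jnp.frompyfunc(lambda a, b: a + b, nin=2, nout=1, identity=0)

table = scalar_add.outer(jnp.arange(3), jnp.arange(4))   # shape (3, 4)
total = scalar_add.reduce(jnp.arange(5))                 # 0+1+2+3+4
```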
Breaking changes:
jax2tf now uses native serialization by default. See the jax2tf documentation for details and for mechanisms to override the default.
The option `--jax_coordination_service` has been removed. It is now always `True`.
`jax.jaxpr_util` has been removed from the public JAX namespace.
`JAX_USE_PJRT_C_API_ON_TPU` no longer has an effect (i.e. it always defaults to true).
The backwards-compatibility flag `--jax_host_callback_ad_transforms`, introduced in December 2021, has been removed.
Deprecations:
Several `jax.numpy` APIs have been deprecated following NumPy NEP-52:
`jax.numpy.NINF` has been deprecated. Use `-jax.numpy.inf` instead.
`jax.numpy.PZERO` has been deprecated. Use `0.0` instead.
`jax.numpy.NZERO` has been deprecated. Use `-0.0` instead.
`jax.numpy.issubsctype(x, t)` has been deprecated. Use `jax.numpy.issubdtype(x.dtype, t)`.
`jax.numpy.row_stack` has been deprecated. Use `jax.numpy.vstack` instead.
`jax.numpy.in1d` has been deprecated. Use `jax.numpy.isin` instead.
`jax.numpy.trapz` has been deprecated. Use `jax.scipy.integrate.trapezoid` instead.
`jax.scipy.linalg.tril` and `jax.scipy.linalg.triu` have been deprecated, following SciPy. Use `jax.numpy.tril` and `jax.numpy.triu` instead.
`jax.lax.prod` has been removed after being deprecated in JAX v0.4.11. Use the built-in `math.prod` instead.
A number of exports from `jax.interpreters.xla` related to defining HLO lowering rules for custom JAX primitives have been deprecated. Custom primitives should be defined using the StableHLO lowering utilities in `jax.interpreters.mlir` instead.
The following previously-deprecated functions have been removed after a three-month deprecation period:
`jax.abstract_arrays.ShapedArray`: use `jax.core.ShapedArray`.
`jax.abstract_arrays.raise_to_shaped`: use `jax.core.raise_to_shaped`.
`jax.numpy.alltrue`: use `jax.numpy.all`.
`jax.numpy.sometrue`: use `jax.numpy.any`.
`jax.numpy.product`: use `jax.numpy.prod`.
`jax.numpy.cumproduct`: use `jax.numpy.cumprod`.
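A few of the replacements above, side by side with the deprecated spellings:

```python
import jax.numpy as jnp
from jax.scipy.integrate import trapezoid

x = jnp.linspace(0.0, 1.0, 101)
area = trapezoid(x**2, x)          # replaces jnp.trapz; integral of x^2 ~ 1/3

mask = jnp.isin(jnp.array([1, 2, 3]), jnp.array([2, 4]))  # replaces jnp.in1d
stacked = jnp.vstack([x, x])       # replaces jnp.row_stack
```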
Deprecations/removals:
The internal submodule
jax.prngis now deprecated. Its contents are available atjax.extend.random.The internal submodule path
jax.linear_utilhas been deprecated. Usejax.extend.linear_utilinstead (Part of jax.extend: a module for extensions)jax.random.PRNGKeyArrayandjax.random.KeyArrayare deprecated. Usejax.Arrayfor type annotations, andjax.dtypes.issubdtype(arr.dtype, jax.dtypes.prng_key)for runtime detection of typed prng keys.The method
PRNGKeyArray.unsafe_raw_arrayis deprecated. Usejax.random.key_data()instead.jax.experimental.pjit.with_sharding_constraintis deprecated. Usejax.lax.with_sharding_constraintinstead.The internal utilities
jax.core.is_opaque_dtypeandjax.core.has_opaque_dtypehave been removed. Opaque dtypes have been renamed to Extended dtypes; usejnp.issubdtype(dtype, jax.dtypes.extended)instead (available since jax v0.4.14).The utility
jax.interpreters.xla.register_collective_primitivehas been removed. This utility did nothing useful in recent JAX releases and calls to it can be safely removed.The internal submodule path
jax.linear_utilhas been deprecated. Usejax.extend.linear_utilinstead (Part of jax.extend: a module for extensions)
jaxlib 0.4.16 (Sept 18, 2023)#
Changes:
Sparse CSR matrix multiplications via the experimental JAX sparse APIs no longer use a deterministic algorithm on NVIDIA GPUs. This change was made to improve compatibility with CUDA 12.2.1.
Bug fixes:
Fixed a crash on Windows due to a fatal LLVM error related to out-of-order sections and IMAGE_REL_AMD64_ADDR32NB relocations (https://github.com/openxla/xla/commit/cb732a921f0c4184995cbed82394931011d12bd4).
jax 0.4.14 (July 27, 2023)#
Changes
`jax.jit` takes `donate_argnames` as an argument. Its semantics are similar to `static_argnames`. If neither `donate_argnums` nor `donate_argnames` is provided, no arguments are donated. If `donate_argnums` is not provided but `donate_argnames` is, or vice versa, JAX uses `inspect.signature(fun)` to find any positional arguments that correspond to `donate_argnames` (or vice versa). If both `donate_argnums` and `donate_argnames` are provided, `inspect.signature` is not used, and only actual parameters listed in either `donate_argnums` or `donate_argnames` will be donated.
`jax.random.gamma()` has been refactored to a more efficient algorithm with more robust endpoint behavior (#16779). This means that the sequence of values returned for a given `key` will change between JAX v0.4.13 and v0.4.14 for `gamma` and related samplers (including `jax.random.ball()`, `jax.random.beta()`, `jax.random.chisquare()`, `jax.random.dirichlet()`, `jax.random.generalized_normal()`, `jax.random.loggamma()`, and `jax.random.t()`).
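A minimal sketch of `donate_argnames` (the update function is hypothetical; on backends without donation support this only emits a warning):

```python
import jax
import jax.numpy as jnp

def sgd_step(params, grads):
    return params - 0.1 * grads

# Donating `params` by name lets XLA reuse its buffer for the output.
step = jax.jit(sgd_step, donate_argnames="params")

params = jnp.ones(3)
new_params = step(params, jnp.full(3, 0.5))
```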
Deletions
`in_axis_resources` and `out_axis_resources` have been deleted from pjit since it has been more than 3 months since their deprecation. Please use `in_shardings` and `out_shardings` as the replacement. This is a safe and trivial name replacement. It does not change any of the current pjit semantics and doesn't break any code. You can still pass in `PartitionSpec`s to `in_shardings` and `out_shardings`.
Deprecations
Python 3.8 support has been dropped as per https://jax.readthedocs.io/en/latest/deprecation.html
JAX now requires NumPy 1.22 or newer as per https://jax.readthedocs.io/en/latest/deprecation.html
Passing optional arguments to `jax.numpy.ndarray.at()` by position is no longer supported, after being deprecated in JAX version 0.4.7. For example, instead of `x.at[i].get(True)`, use `x.at[i].get(indices_are_sorted=True)`.
The following `jax.Array` methods have been removed, after being deprecated in JAX v0.4.5:
`jax.Array.broadcast`: use `jax.lax.broadcast()` instead.
`jax.Array.broadcast_in_dim`: use `jax.lax.broadcast_in_dim()` instead.
`jax.Array.split`: use `jax.numpy.split()` instead.
The following APIs have been removed after previous deprecation:
`jax.ad`: use `jax.interpreters.ad`.
`jax.curry`: use `curry = lambda f: partial(partial, f)`.
`jax.partial_eval`: use `jax.interpreters.partial_eval`.
`jax.pxla`: use `jax.interpreters.pxla`.
`jax.xla`: use `jax.interpreters.xla`.
`jax.ShapedArray`: use `jax.core.ShapedArray`.
`jax.interpreters.pxla.device_put`: use `jax.device_put()`.
`jax.interpreters.pxla.make_sharded_device_array`: use `jax.make_array_from_single_device_arrays()`.
`jax.interpreters.pxla.ShardedDeviceArray`: use `jax.Array`.
`jax.numpy.DeviceArray`: use `jax.Array`.
`jax.stages.Compiled.compiler_ir`: use `jax.stages.Compiled.as_text()`.
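The keyword-only `.at[]` form described above looks like this:

```python
import jax.numpy as jnp

x = jnp.arange(5.0)

# The positional form x.at[2].get(True) is no longer accepted; use keywords:
v = x.at[2].get(indices_are_sorted=True, unique_indices=True)
y = x.at[2].set(10.0, indices_are_sorted=True)   # x itself is unchanged
```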
Breaking changes
JAX now requires ml_dtypes version 0.2.0 or newer.
To fix a corner case, calls to `jax.lax.cond()` with five arguments will always resolve to the "common operands" `cond` behavior (as documented) if the second and third arguments are callable, even if other operands are callable as well. See #16413.
The deprecated config options `jax_array` and `jax_jit_pjit_api_merge`, which did nothing, have been removed. These options have been true by default for many releases.
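A five-argument call in the "common operands" form looks like this; both branch functions receive both operands:

```python
import jax

# pred, true_fun, false_fun, and two operands passed to whichever branch runs.
out = jax.lax.cond(True,
                   lambda a, b: a + b,
                   lambda a, b: a - b,
                   2.0, 1.0)
```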
New features
JAX now supports a configuration flag `--jax_serialization_version` and a `JAX_SERIALIZATION_VERSION` environment variable to control the serialization version (#16746).
jax2tf in presence of shape polymorphism now generates code that checks certain shape constraints, if the serialization version is at least 7. See https://github.com/google/jax/blob/main/jax/experimental/jax2tf/README.md#errors-in-presence-of-shape-polymorphism.
jaxlib 0.4.14 (July 27, 2023)#
Deprecations
Python 3.8 support has been dropped as per https://jax.readthedocs.io/en/latest/deprecation.html
jax 0.4.13 (June 22, 2023)#
Changes
`jax.jit` now allows `None` to be passed to `in_shardings` and `out_shardings`. The semantics are as follows:
For `in_shardings`, JAX will mark them as replicated, but this behavior can change in the future.
For `out_shardings`, we will rely on the XLA GSPMD partitioner to determine the output shardings.
`jax.experimental.pjit.pjit` also allows `None` to be passed to `in_shardings` and `out_shardings`. The semantics are as follows:
If the mesh context manager is not provided, JAX has the freedom to choose whatever sharding it wants.
For `in_shardings`, JAX will mark them as replicated, but this behavior can change in the future.
For `out_shardings`, we will rely on the XLA GSPMD partitioner to determine the output shardings.
If the mesh context manager is provided, `None` will imply that the value will be replicated on all devices of the mesh.
`Executable.cost_analysis()` now works on Cloud TPU.
Added a warning if a non-allowlisted `jaxlib` plugin is in use.
Added `jax.tree_util.tree_leaves_with_path`.
`None` is not a valid input to `jax.experimental.multihost_utils.host_local_array_to_global_array` or `jax.experimental.multihost_utils.global_array_to_host_local_array`. Please use `jax.sharding.PartitionSpec()` if you want to replicate your input.
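A quick sketch of the new `tree_leaves_with_path` helper on a small pytree:

```python
from jax.tree_util import tree_leaves_with_path, keystr

tree = {"a": 1, "b": [2, 3]}
# Each entry pairs a key path (a tuple of key objects) with its leaf.
paths_and_leaves = tree_leaves_with_path(tree)
labels = [keystr(path) for path, _ in paths_and_leaves]
```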
Bug fixes
Fixed incorrect wheel name in CUDA 12 releases (#16362); the correct wheel is named `cudnn89` instead of `cudnn88`.
Deprecations
The `native_serialization_strict_checks` parameter to `jax.experimental.jax2tf.convert()` is deprecated in favor of the new `native_serialization_disabled_checks` (#16347).
jaxlib 0.4.13 (June 22, 2023)#
Changes
Added Windows CPU-only wheels to the `jaxlib` PyPI release.
Bug fixes
`__cuda_array_interface__` was broken in previous jaxlib versions and is now fixed (#16440).
Concurrent CUDA kernel tracing is now enabled by default on NVIDIA GPUs.
jax 0.4.12 (June 8, 2023)#
Changes
Deprecations
`jax.abstract_arrays` and its contents are now deprecated. See related functionality in `jax.core`.
`jax.numpy.alltrue`: use `jax.numpy.all`. This follows the deprecation of `numpy.alltrue` in NumPy version 1.25.0.
`jax.numpy.sometrue`: use `jax.numpy.any`. This follows the deprecation of `numpy.sometrue` in NumPy version 1.25.0.
`jax.numpy.product`: use `jax.numpy.prod`. This follows the deprecation of `numpy.product` in NumPy version 1.25.0.
`jax.numpy.cumproduct`: use `jax.numpy.cumprod`. This follows the deprecation of `numpy.cumproduct` in NumPy version 1.25.0.
`jax.sharding.OpShardingSharding` has been removed since it has been 3 months since it was deprecated.
jaxlib 0.4.12 (June 8, 2023)#
Changes
Includes PTX/SASS for Hopper (SM version 9.0+) GPUs. Previous versions of jaxlib should work on Hopper but would have a long JIT-compilation delay the first time a JAX operation was executed.
Bug fixes
Fixes incorrect source line information in JAX-generated Python tracebacks under Python 3.11.
Fixes crash when printing local variables of frames in JAX-generated Python tracebacks (#16027).
jax 0.4.11 (May 31, 2023)#
Deprecations
The following APIs have been removed after a 3-month deprecation period, in accordance with the API compatibility policy:
`jax.experimental.PartitionSpec`: use `jax.sharding.PartitionSpec`.
`jax.experimental.maps.Mesh`: use `jax.sharding.Mesh`.
`jax.experimental.pjit.NamedSharding`: use `jax.sharding.NamedSharding`.
`jax.experimental.pjit.PartitionSpec`: use `jax.sharding.PartitionSpec`.
`jax.experimental.pjit.FROM_GDA`: instead pass sharded `jax.Array` objects as input and remove the optional `in_shardings` argument to `pjit`.
`jax.interpreters.pxla.PartitionSpec`: use `jax.sharding.PartitionSpec`.
`jax.interpreters.pxla.Mesh`: use `jax.sharding.Mesh`.
`jax.interpreters.xla.Buffer`: use `jax.Array`.
`jax.interpreters.xla.Device`: use `jax.Device`.
`jax.interpreters.xla.DeviceArray`: use `jax.Array`.
`jax.interpreters.xla.device_put`: use `jax.device_put`.
`jax.interpreters.xla.xla_call_p`: use `jax.experimental.pjit.pjit_p`.
The `axis_resources` argument of `with_sharding_constraint` is removed. Please use `shardings` instead.
jaxlib 0.4.11 (May 31, 2023)#
Changes
Added a `memory_stats()` method to `Device`s. If supported, this returns a dict of string stat names with int values, e.g. `"bytes_in_use"`, or `None` if the platform doesn't support memory statistics. The exact stats returned may vary across platforms. Currently only implemented on Cloud TPU.
Re-added support for the Python buffer protocol (`memoryview`) on CPU devices.
jax 0.4.10 (May 11, 2023)#
jaxlib 0.4.10 (May 11, 2023)#
Changes
Fixed the `'apple-m1' is not a recognized processor for this target (ignoring processor)` issue that prevented the previous release from running on Mac M1.
jax 0.4.9 (May 9, 2023)#
Changes
The flags experimental_cpp_jit, experimental_cpp_pjit and experimental_cpp_pmap have been removed. They are now always on.
Accuracy of singular value decomposition (SVD) on TPU has been improved (requires jaxlib 0.4.9).
Deprecations
`jax.experimental.gda_serialization` is deprecated and has been renamed to `jax.experimental.array_serialization`. Please change your imports to use `jax.experimental.array_serialization`.
The `in_axis_resources` and `out_axis_resources` arguments of pjit have been deprecated. Please use `in_shardings` and `out_shardings` respectively.
The function `jax.numpy.msort` has been removed. It has been deprecated since JAX v0.4.1. Use `jnp.sort(a, axis=0)` instead.
The `in_parts` and `out_parts` arguments have been removed from `jax.xla_computation` since they were only used with `sharded_jit`, and `sharded_jit` is long gone.
The `instantiate_const_outputs` argument has been removed from `jax.xla_computation` since it has been unused for a very long time.
jaxlib 0.4.9 (May 9, 2023)#
jax 0.4.8 (March 29, 2023)#
Breaking changes
A major component of the Cloud TPU runtime has been upgraded. This enables the following new features on Cloud TPU:
`jax.debug.print()`, `jax.debug.callback()`, and `jax.debug.breakpoint()` now work on Cloud TPU.
Automatic TPU memory defragmentation.
`jax.experimental.host_callback()` is no longer supported on Cloud TPU with the new runtime component. Please file an issue on the JAX issue tracker if the new `jax.debug` APIs are insufficient for your use case.
The old runtime component will be available for at least the next three months by setting the environment variable `JAX_USE_PJRT_C_API_ON_TPU=false`. If you find you need to disable the new runtime for any reason, please let us know on the JAX issue tracker.
Changes
The minimum jaxlib version has been bumped from 0.4.6 to 0.4.7.
Deprecations
CUDA 11.4 support has been dropped. JAX GPU wheels only support CUDA 11.8 and CUDA 12. Older CUDA versions may work if jaxlib is built from source.
The `global_arg_shapes` argument of pmap only worked with sharded_jit and has been removed from pmap. Please migrate to pjit and remove `global_arg_shapes` from pmap.
jax 0.4.7 (March 27, 2023)#
Changes
As per https://jax.readthedocs.io/en/latest/jax_array_migration.html#jax-array-migration:
`jax.config.jax_array` cannot be disabled anymore.
`jax.config.jax_jit_pjit_api_merge` cannot be disabled anymore.
`jax.experimental.jax2tf.convert()` now supports the `native_serialization` parameter to use JAX's native lowering to StableHLO to obtain a StableHLO module for the entire JAX function instead of lowering each JAX primitive to a TensorFlow op. This simplifies the internals and increases the confidence that what you serialize matches the JAX native semantics. See documentation. As part of this change the config flag `--jax2tf_default_experimental_native_lowering` has been renamed to `--jax2tf_native_serialization`.
JAX now depends on `ml_dtypes`, which contains definitions of NumPy types like bfloat16. These definitions were previously internal to JAX, but have been split into a separate package to facilitate sharing them with other projects.
JAX now requires NumPy 1.21 or newer and SciPy 1.7 or newer.
Deprecations
The type `jax.numpy.DeviceArray` is deprecated. Use `jax.Array` instead, for which it is an alias.
The type `jax.interpreters.pxla.ShardedDeviceArray` is deprecated. Use `jax.Array` instead.
Passing additional arguments to `jax.numpy.ndarray.at()` by position is deprecated. For example, instead of `x.at[i].get(True)`, use `x.at[i].get(indices_are_sorted=True)`.
`jax.interpreters.xla.device_put` is deprecated. Please use `jax.device_put`.
`jax.interpreters.pxla.device_put` is deprecated. Please use `jax.device_put`.
`jax.experimental.pjit.FROM_GDA` is deprecated. Please pass in sharded `jax.Array`s as input and remove the `in_shardings` argument to pjit since it is optional.
jaxlib 0.4.7 (March 27, 2023)#
Changes:
jaxlib now depends on `ml_dtypes`, which contains definitions of NumPy types like bfloat16. These definitions were previously internal to JAX, but have been split into a separate package to facilitate sharing them with other projects.
jax 0.4.6 (Mar 9, 2023)#
Changes
`jax.tree_util` now contains a set of APIs that allow users to define keys for their custom pytree nodes. This includes:
`tree_flatten_with_path`, which flattens a tree and returns not only each leaf but also its key path.
`tree_map_with_path`, which can map a function that takes the key path as an argument.
`register_pytree_with_keys`, to register how the key path and leaves should look in a custom pytree node.
`keystr`, which pretty-prints a key path.
`jax2tf.call_tf()` has a new parameter `output_shape_dtype` (default `None`) that can be used to declare the output shape and type of the result. This enables `jax2tf.call_tf()` to work in the presence of shape polymorphism (#14734).
Deprecations
The old key-path APIs in `jax.tree_util` are deprecated and will be removed 3 months from Mar 10, 2023:
`register_keypaths`: use `jax.tree_util.register_pytree_with_keys()` instead.
`AttributeKeyPathEntry`: use `GetAttrKey` instead.
`GetitemKeyPathEntry`: use `SequenceKey` or `DictKey` instead.
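A sketch of the replacement API on a hypothetical `Point` node, using `GetAttrKey` where `AttributeKeyPathEntry` would previously have appeared:

```python
from jax.tree_util import (GetAttrKey, register_pytree_with_keys,
                           tree_flatten_with_path)

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

# The flatten function returns (key, child) pairs plus static aux data.
register_pytree_with_keys(
    Point,
    lambda p: (((GetAttrKey("x"), p.x), (GetAttrKey("y"), p.y)), None),
    lambda aux, children: Point(*children),
)

paths_and_leaves, _ = tree_flatten_with_path(Point(1.0, 2.0))
```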
jaxlib 0.4.6 (Mar 9, 2023)#
jax 0.4.5 (Mar 2, 2023)#
Deprecations
`jax.sharding.OpShardingSharding` has been renamed to `jax.sharding.GSPMDSharding`. `jax.sharding.OpShardingSharding` will be removed 3 months from Feb 17, 2023.
The following `jax.Array` methods are deprecated and will be removed 3 months from Feb 23, 2023:
`jax.Array.broadcast`: use `jax.lax.broadcast()` instead.
`jax.Array.broadcast_in_dim`: use `jax.lax.broadcast_in_dim()` instead.
`jax.Array.split`: use `jax.numpy.split()` instead.
jax 0.4.4 (Feb 16, 2023)#
Changes
The implementation of `jit` and `pjit` has been merged. Merging jit and pjit changes the internals of JAX without affecting the public API of JAX. Before, `jit` was a final-style primitive. Final style means that the creation of jaxprs was delayed as much as possible and transformations were stacked on top of each other. With the `jit`-`pjit` implementation merge, `jit` becomes an initial-style primitive, which means that we trace to a jaxpr as early as possible. For more information see this section in autodidax. Moving to initial style should simplify JAX's internals and make development of features like dynamic shapes easier. You can disable it only via the environment variable, i.e. `os.environ['JAX_JIT_PJIT_API_MERGE'] = '0'`. The merge must be disabled via an environment variable since it affects JAX at import time, so it needs to be disabled before jax is imported.
The `axis_resources` argument of `with_sharding_constraint` is deprecated. Please use `shardings` instead. There is no change needed if you were using `axis_resources` as an arg. If you were using it as a kwarg, then please use `shardings` instead. `axis_resources` will be removed 3 months from Feb 13, 2023.
Added the `jax.typing` module, with tools for type annotations of JAX functions.
The following names have been deprecated:
`jax.xla.Device` and `jax.interpreters.xla.Device`: use `jax.Device`.
`jax.experimental.maps.Mesh`: use `jax.sharding.Mesh` instead.
`jax.experimental.pjit.NamedSharding`: use `jax.sharding.NamedSharding`.
`jax.experimental.pjit.PartitionSpec`: use `jax.sharding.PartitionSpec`.
`jax.interpreters.pxla.Mesh`: use `jax.sharding.Mesh`.
`jax.interpreters.pxla.PartitionSpec`: use `jax.sharding.PartitionSpec`.
Breaking Changes
The `initial` argument to reduction functions like `jax.numpy.sum` is now required to be a scalar, consistent with the corresponding NumPy API. The previous behavior of broadcasting the output against non-scalar `initial` values was an unintentional implementation detail (#14446).
jaxlib 0.4.4 (Feb 16, 2023)#
Breaking changes
Support for NVIDIA Kepler series GPUs has been removed from the default `jaxlib` builds. If Kepler support is needed, it is still possible to build `jaxlib` from source with Kepler support (via the `--cuda_compute_capabilities=sm_35` option to `build.py`); however, note that CUDA 12 has completely dropped support for Kepler GPUs.
jax 0.4.3 (Feb 8, 2023)#
Breaking changes
Deleted `jax.scipy.linalg.polar_unitary()`, which was a deprecated JAX extension to the SciPy API. Use `jax.scipy.linalg.polar()` instead.
Changes
Added `jax.scipy.stats.rankdata()`.
jaxlib 0.4.3 (Feb 8, 2023)#
`jax.Array` now has the non-blocking `is_ready()` method, which returns `True` if the array is ready (see also `jax.block_until_ready()`).
jax 0.4.2 (Jan 24, 2023)#
Breaking changes
Deleted `jax.experimental.callback`.
Operations with dimensions in presence of jax2tf shape polymorphism have been generalized to work in more scenarios, by converting the symbolic dimension to JAX arrays. Operations involving symbolic dimensions and `np.ndarray` now can raise errors when the result is used as a shape value (#14106).
jaxpr objects now raise an error on attribute setting in order to avoid problematic mutations (#14102).
Changes
`jax2tf.call_tf()` has a new parameter `has_side_effects` (default `True`) that can be used to declare whether an instance can be removed or replicated by JAX optimizations such as dead-code elimination (#13980).
Added more support for floordiv and mod for jax2tf shape polymorphism. Previously, certain division operations resulted in errors in the presence of symbolic dimensions (#14108).
jaxlib 0.4.2 (Jan 24, 2023)#
Changes
Set `JAX_USE_PJRT_C_API_ON_TPU=1` to enable the new Cloud TPU runtime, featuring automatic device memory defragmentation.
jax 0.4.1 (Dec 13, 2022)#
Changes
Support for Python 3.7 has been dropped, in accordance with JAX’s Python and NumPy version support policy.
We introduce `jax.Array`, a unified array type that subsumes the `DeviceArray`, `ShardedDeviceArray`, and `GlobalDeviceArray` types in JAX. The `jax.Array` type helps make parallelism a core feature of JAX, simplifies and unifies JAX internals, and allows us to unify `jit` and `pjit`. `jax.Array` has been enabled by default in JAX 0.4 and makes some breaking changes to the `pjit` API. The jax.Array migration guide can help you migrate your codebase to `jax.Array`. You can also look at the Distributed arrays and automatic parallelization tutorial to understand the new concepts.
`PartitionSpec` and `Mesh` are now out of experimental. The new API endpoints are `jax.sharding.PartitionSpec` and `jax.sharding.Mesh`. `jax.experimental.maps.Mesh` and `jax.experimental.PartitionSpec` are deprecated and will be removed in 3 months.
`with_sharding_constraint`'s new public endpoint is `jax.lax.with_sharding_constraint`.
If using ABSL flags together with `jax.config`, the ABSL flag values are no longer read or written after the JAX configuration options are initially populated from the ABSL flags. This change improves the performance of reading `jax.config` options, which are used pervasively in JAX.
The `jax2tf.call_tf` function now uses, for TF lowering, the first TF device of the same platform as used by the embedding JAX computation. Before, it was using the 0th device for the JAX-default backend.
A number of `jax.numpy` functions now have their arguments marked as positional-only, matching NumPy.
`jnp.msort` is now deprecated, following the deprecation of `np.msort` in NumPy 1.24. It will be removed in a future release, in accordance with the API compatibility policy. It can be replaced with `jnp.sort(a, axis=0)`.
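The `jnp.msort` replacement mentioned above, on a small example:

```python
import jax.numpy as jnp

a = jnp.array([[3, 1],
               [2, 4]])
# jnp.msort(a) sorted each column; the drop-in replacement is:
sorted_a = jnp.sort(a, axis=0)
```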
jaxlib 0.4.1 (Dec 13, 2022)#
Changes
Support for Python 3.7 has been dropped, in accordance with JAX’s Python and NumPy version support policy.
The behavior of `XLA_PYTHON_CLIENT_MEM_FRACTION=.XX` has been changed to allocate XX% of the total GPU memory instead of the previous behavior of using currently available GPU memory to calculate preallocation. Please refer to GPU memory allocation for more details.
The deprecated method `.block_host_until_ready()` has been removed. Use `.block_until_ready()` instead.
jax 0.4.0 (Dec 12, 2022)#
The release was yanked.
jaxlib 0.4.0 (Dec 12, 2022)#
The release was yanked.
jax 0.3.25 (Nov 15, 2022)#
Changes
`jax.numpy.linalg.pinv()` now supports the `hermitian` option.
`jax.scipy.linalg.hessenberg()` is now supported on CPU only. Requires jaxlib > 0.3.24.
New functions `jax.lax.linalg.hessenberg()`, `jax.lax.linalg.tridiagonal()`, and `jax.lax.linalg.householder_product()` were added. Householder reduction is currently CPU-only and tridiagonal reductions are supported on CPU and GPU only.
The gradients of `svd` and `jax.numpy.linalg.pinv` are now computed more economically for non-square matrices.
Breaking Changes
Deleted the `jax_experimental_name_stack` config option.
Convert a string `axis_names` argument to the `jax.experimental.maps.Mesh` constructor into a singleton tuple instead of unpacking the string into a sequence of character axis names.
jaxlib 0.3.25 (Nov 15, 2022)#
Changes
Added support for tridiagonal reductions on CPU and GPU.
Added support for upper Hessenberg reductions on CPU.
Bugs
Fixed a bug that meant that frames in tracebacks captured by JAX were incorrectly mapped to source lines under Python 3.10+.
jax 0.3.24 (Nov 4, 2022)#
Changes
JAX should be faster to import. We now import scipy lazily, which accounted for a significant fraction of JAX’s import time.
Setting the env var `JAX_PERSISTENT_CACHE_MIN_COMPILE_TIME_SECS=$N` can be used to limit the number of cache entries written to the persistent cache. By default, computations that take 1 second or more to compile will be cached.
Added `jax.scipy.stats.mode()`.
The default device order used by `pmap` on TPU if no order is specified now matches `jax.devices()` for single-process jobs. Previously the two orderings differed, which could lead to unnecessary copies or out-of-memory errors. Requiring the orderings to agree simplifies matters.
Breaking Changes
`jax.numpy.gradient()` now behaves like most other functions in `jax.numpy`, and forbids passing lists or tuples in place of arrays (#12958).
Functions in `jax.numpy.linalg` and `jax.numpy.fft` now uniformly require inputs to be array-like: i.e., lists and tuples cannot be used in place of arrays. Part of #7737.
Deprecations
`jax.sharding.MeshPspecSharding` has been renamed to `jax.sharding.NamedSharding`. The `jax.sharding.MeshPspecSharding` name will be removed in 3 months.
jaxlib 0.3.24 (Nov 4, 2022)#
Changes
Buffer donation now works on CPU. This may break code that marked buffers for donation on CPU but relied on donation not being implemented.
jax 0.3.23 (Oct 12, 2022)#
Changes
Update Colab TPU driver version for new jaxlib release.
jax 0.3.22 (Oct 11, 2022)#
Changes
Add `JAX_PLATFORMS=tpu,cpu` as the default setting in TPU initialization, so JAX will raise an error if the TPU cannot be initialized instead of falling back to CPU. Set `JAX_PLATFORMS=''` to override this behavior and automatically choose an available backend (the original default), or set `JAX_PLATFORMS=cpu` to always use CPU regardless of whether a TPU is available.
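A minimal sketch of forcing the CPU backend from Python; the variable must be set before `jax` is imported, so this belongs at the top of a script:

```python
import os

# Must run before the first `import jax` in the process.
os.environ["JAX_PLATFORMS"] = "cpu"

import jax

devices = jax.devices()  # only CPU devices, even if an accelerator is present
```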
Deprecations
Several test utilities deprecated in JAX v0.3.8 are now removed from `jax.test_util`.
jaxlib 0.3.22 (Oct 11, 2022)#
jax 0.3.21 (Sep 30, 2022)#
Changes
The persistent compilation cache will now warn instead of raising an exception on error (#12582), so program execution can continue if something goes wrong with the cache. Set `JAX_RAISE_PERSISTENT_CACHE_ERRORS=true` to revert this behavior.
jax 0.3.20 (Sep 28, 2022)#
jaxlib 0.3.20 (Sep 28, 2022)#
Bug fixes
Fixes support for limiting the visible CUDA devices via `jax_cuda_visible_devices` in distributed jobs. This functionality is needed for the JAX/SLURM integration on GPU (#12533).
jax 0.3.19 (Sep 27, 2022)#
Fixes required jaxlib version.
jax 0.3.18 (Sep 26, 2022)#
Changes
Ahead-of-time lowering and compilation functionality (tracked in #7733) is stable and public. See the overview and the API docs for `jax.stages`.
Introduced `jax.Array`, intended to be used for both `isinstance` checks and type annotations for array types in JAX. Notice that this included some subtle changes to how `isinstance` works for `jax.numpy.ndarray` for jax-internal objects, as `jax.numpy.ndarray` is now a simple alias of `jax.Array`.
Breaking changes
`jax._src` is no longer imported into the public `jax` namespace. This may break users that were using JAX internals.
`jax.soft_pmap` has been deleted. Please use `pjit` or `xmap` instead. `jax.soft_pmap` is undocumented. If it were documented, a deprecation period would have been provided.
jax 0.3.17 (Aug 31, 2022)#
Bugs
Fixed a corner-case issue in the gradient of `lax.pow` with an exponent of zero (#12041).
Breaking changes
`jax.checkpoint()`, also known as `jax.remat()`, no longer supports the `concrete` option, following the previous version's deprecation; see JEP 11830.
Changes
Added `jax.pure_callback()`, which enables calling back to pure Python functions from compiled functions (e.g. functions decorated with `jax.jit` or `jax.pmap`).
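A minimal sketch of calling a host-side NumPy function from jitted code via `jax.pure_callback()`; the `host_sin` helper name is illustrative:

```python
import numpy as np
import jax
import jax.numpy as jnp

def host_sin(x):
    # Runs as ordinary Python/NumPy on the host.
    return np.sin(x).astype(np.float32)

@jax.jit
def f(x):
    # The ShapeDtypeStruct tells JAX the callback's output shape and dtype.
    out_spec = jax.ShapeDtypeStruct(x.shape, jnp.float32)
    return jax.pure_callback(host_sin, out_spec, x)

y = f(jnp.arange(3.0))
```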
Deprecations:
- The deprecated `DeviceArray.tile()` method has been removed. Use `jax.numpy.tile()` (#11944).
- `DeviceArray.to_py()` has been deprecated. Use `np.asarray(x)` instead.
jax 0.3.16#
Breaking changes
Support for NumPy 1.19 has been dropped, per the deprecation policy. Please upgrade to NumPy 1.20 or newer.
Changes
Added
jax.debugthat includes utilities for runtime value debugging such atjax.debug.print()andjax.debug.breakpoint().Added new documentation for runtime value debugging
Deprecations
- The `jax.mask()` and `jax.shapecheck()` APIs have been removed. See #11557.
- `jax.experimental.loops` has been removed. See #10278 for an alternative API.
- `jax.tree_util.tree_multimap()` has been removed. It has been deprecated since JAX release 0.3.5, and `jax.tree_util.tree_map()` is a direct replacement.
- Removed `jax.experimental.stax`; it has long been a deprecated alias of `jax.example_libraries.stax`.
- Removed `jax.experimental.optimizers`; it has long been a deprecated alias of `jax.example_libraries.optimizers`.
- `jax.checkpoint()`, also known as `jax.remat()`, has a new implementation switched on by default, meaning the old implementation is deprecated; see JEP 11830.
jax 0.3.15 (July 22, 2022)#
Changes
- `JaxTestCase` and `JaxTestLoader` have been removed from `jax.test_util`. These classes have been deprecated since v0.3.1 (#11248).
- Added `jax.scipy.gaussian_kde` (#11237).
- Binary operations between JAX arrays and built-in collections (`dict`, `list`, `set`, `tuple`) now raise a `TypeError` in all cases. Previously some cases (particularly equality and inequality) would return boolean scalars inconsistent with similar operations in NumPy (#11234).
- Several `jax.tree_util` routines accessed as top-level JAX package imports are now deprecated, and will be removed in a future JAX release in accordance with the API compatibility policy:
  - `jax.treedef_is_leaf()` is deprecated in favor of `jax.tree_util.treedef_is_leaf()`
  - `jax.tree_flatten()` is deprecated in favor of `jax.tree_util.tree_flatten()`
  - `jax.tree_leaves()` is deprecated in favor of `jax.tree_util.tree_leaves()`
  - `jax.tree_structure()` is deprecated in favor of `jax.tree_util.tree_structure()`
  - `jax.tree_transpose()` is deprecated in favor of `jax.tree_util.tree_transpose()`
  - `jax.tree_unflatten()` is deprecated in favor of `jax.tree_util.tree_unflatten()`
- The `sym_pos` argument of `jax.scipy.linalg.solve()` is deprecated in favor of `assume_a='pos'`, following a similar deprecation in `scipy.linalg.solve()`.
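A short sketch of the preferred `jax.tree_util` spellings mentioned above, using a flatten/unflatten round trip:

```python
import jax

tree = {"a": [1, 2], "b": 3}

# Use the jax.tree_util namespace rather than the deprecated top-level aliases.
leaves, treedef = jax.tree_util.tree_flatten(tree)

# Rebuild a tree of the same structure from transformed leaves.
doubled = jax.tree_util.tree_unflatten(treedef, [2 * leaf for leaf in leaves])
```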
jaxlib 0.3.15 (July 22, 2022)#
jax 0.3.14 (June 27, 2022)#
Breaking changes
- `jax.experimental.compilation_cache.initialize_cache()` no longer supports `max_cache_size_bytes` and will not accept it as an input.
- `JAX_PLATFORMS` now raises an exception when platform initialization fails.
Changes
Fixed compatibility problems with NumPy 1.23.
- `jax.numpy.linalg.slogdet()` now accepts an optional `method` argument that allows selection between an LU-decomposition based implementation and an implementation based on QR decomposition.
- `jax.numpy.linalg.qr()` now supports `mode="raw"`.
- `pickle`, `copy.copy`, and `copy.deepcopy` now have more complete support when used on jax arrays (#10659). In particular:
  - `pickle` and `deepcopy` previously returned `np.ndarray` objects when used on a `DeviceArray`; now `DeviceArray` objects are returned. For `deepcopy`, the copied array is on the same device as the original. For `pickle` the deserialized array will be on the default device.
  - Within function transformations (i.e. traced code), `deepcopy` and `copy` previously were no-ops. Now they use the same mechanism as `DeviceArray.copy()`.
  - Calling `pickle` on a traced array now results in an explicit `ConcretizationTypeError`.
- The implementation of singular value decomposition (SVD) and symmetric/Hermitian eigendecomposition should be significantly faster on TPU, especially for matrices above 1000x1000 or so. Both now use a spectral divide-and-conquer algorithm for eigendecomposition (QDWH-eig).
- `jax.numpy.ldexp()` no longer silently promotes all inputs to float64; instead it promotes to float32 for integer inputs of size int32 or smaller (#10921).
- Added a `create_perfetto_link` option to `jax.profiler.start_trace()` and `jax.profiler.trace()`. When used, the profiler will generate a link to the Perfetto UI to view the trace.
- Changed the semantics of `jax.profiler.start_server(...)` to store the keepalive globally, rather than requiring the user to keep a reference to it.
- Added `jax.random.ball()`.
- Added `jax.default_device()`.
- Added a `python -m jax.collect_profile` script to manually capture program traces as an alternative to the Tensorboard UI.
- Added a `jax.named_scope` context manager that adds profiler metadata to Python programs (similar to `jax.named_call`).
- In scatter-update operations (i.e. `jax.numpy.ndarray.at`), unsafe implicit dtype casts are deprecated, and now result in a `FutureWarning`. In a future release, this will become an error. An example of an unsafe implicit cast is `jnp.zeros(4, dtype=int).at[0].set(1.5)`, in which `1.5` previously was silently truncated to `1`.
- `jax.experimental.compilation_cache.initialize_cache()` now supports a GCS bucket path as input.
- Added `jax.scipy.stats.gennorm()`.
- `jax.numpy.roots()` is now better behaved when `strip_zeros=False` when coefficients have leading zeros (#11215).
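A small sketch of the scatter-update semantics described above: a same-dtype `.at[...]` update is safe, while a float-into-int update is the kind of implicit cast now flagged with a `FutureWarning`:

```python
import jax.numpy as jnp

x = jnp.zeros(4)

# Same-dtype update: no implicit cast, so no warning.
y = x.at[0].set(1.5)

# By contrast, jnp.zeros(4, dtype=int).at[0].set(1.5) would trigger the
# deprecated unsafe implicit cast (1.5 truncated to 1).
```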
jaxlib 0.3.14 (June 27, 2022)#
- x86-64 Mac wheels now require Mac OS 10.14 (Mojave) or newer. Mac OS 10.14 was released in 2018, so this should not be a very onerous requirement.
The bundled version of NCCL was updated to 2.12.12, fixing some deadlocks.
The Python flatbuffers package is no longer a dependency of jaxlib.
jax 0.3.13 (May 16, 2022)#
jax 0.3.12 (May 15, 2022)#
Changes
Fixes #10717.
jax 0.3.11 (May 15, 2022)#
Changes
`jax.lax.eigh()` now accepts an optional `sort_eigenvalues` argument that allows users to opt out of eigenvalue sorting on TPU.
Deprecations
Non-array arguments to functions in
jax.lax.linalgare now marked keyword-only. As a backward-compatibility step passing keyword-only arguments positionally yields a warning, but in a future JAX release passing keyword-only arguments positionally will fail. However, most users should prefer to usejax.numpy.linalginstead.jax.scipy.linalg.polar_unitary(), which was a JAX extension to the scipy API, is deprecated. Usejax.scipy.linalg.polar()instead.
jax 0.3.10 (May 3, 2022)#
jaxlib 0.3.10 (May 3, 2022)#
Changes
TF commit fixes an issue in the MHLO canonicalizer that caused constant folding to take a long time or crash for certain programs.
jax 0.3.9 (May 2, 2022)#
Changes
Added support for fully asynchronous checkpointing for GlobalDeviceArray.
jax 0.3.8 (April 29 2022)#
Changes
- `jax.numpy.linalg.svd()` on TPUs uses a qdwh-svd solver.
- `jax.numpy.linalg.cond()` on TPUs now accepts complex input.
- `jax.numpy.linalg.pinv()` on TPUs now accepts complex input.
- `jax.numpy.linalg.matrix_rank()` on TPUs now accepts complex input.
- `jax.scipy.cluster.vq.vq()` has been added.
- `jax.experimental.maps.mesh` has been deleted. Please use `jax.experimental.maps.Mesh`. See https://jax.readthedocs.io/en/latest/_autosummary/jax.experimental.maps.Mesh.html#jax.experimental.maps.Mesh for more information.
- `jax.scipy.linalg.qr()` now returns a length-1 tuple rather than the raw array when `mode='r'`, in order to match the behavior of `scipy.linalg.qr` (#10452).
- `jax.numpy.take_along_axis()` now takes an optional `mode` parameter that specifies the behavior of out-of-bounds indexing. By default, invalid values (e.g., NaN) will be returned for out-of-bounds indices. In previous versions of JAX, invalid indices were clamped into range. The previous behavior can be restored by passing `mode="clip"`.
- `jax.numpy.take()` now defaults to `mode="fill"`, which returns invalid values (e.g., NaN) for out-of-bounds indices.
- Scatter operations, such as `x.at[...].set(...)`, now have `"drop"` semantics. This has no effect on the scatter operation itself, but it means that when differentiated, the gradient of a scatter will yield zero cotangents for out-of-bounds indices. Previously out-of-bounds indices were clamped into range for the gradient, which was not mathematically correct.
- `jax.numpy.take_along_axis()` now raises a `TypeError` if its indices are not of an integer type, matching the behavior of `numpy.take_along_axis()`. Previously non-integer indices were silently cast to integers.
- `jax.numpy.ravel_multi_index()` now raises a `TypeError` if its `dims` argument is not of an integer type, matching the behavior of `numpy.ravel_multi_index()`. Previously a non-integer `dims` was silently cast to integers.
- `jax.numpy.split()` now raises a `TypeError` if its `axis` argument is not of an integer type, matching the behavior of `numpy.split()`. Previously a non-integer `axis` was silently cast to an integer.
- `jax.numpy.indices()` now raises a `TypeError` if its dimensions are not of an integer type, matching the behavior of `numpy.indices()`. Previously non-integer dimensions were silently cast to integers.
- `jax.numpy.diag()` now raises a `TypeError` if its `k` argument is not of an integer type, matching the behavior of `numpy.diag()`. Previously a non-integer `k` was silently cast to an integer.
- Added `jax.random.orthogonal()`.
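A brief sketch of the out-of-bounds indexing modes described above, using `mode="clip"` (the pre-0.3.8 clamping behavior) so the result is deterministic:

```python
import jax.numpy as jnp

x = jnp.array([10, 20, 30])
idx = jnp.array([0, 5])  # index 5 is out of bounds

# mode="clip" restores the old behavior: out-of-bounds indices are
# clamped into range, so 5 is treated as 2.
clipped = jnp.take(x, idx, mode="clip")
```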
Deprecations
Many functions and objects available in `jax.test_util` are now deprecated and will raise a warning on import. This includes `cases_from_list`, `check_close`, `check_eq`, `device_under_test`, `format_shape_dtype_string`, `rand_uniform`, `skip_on_devices`, `with_config`, `xla_bridge`, and `_default_tolerance` (#10389). These, along with the previously-deprecated `JaxTestCase`, `JaxTestLoader`, and `BufferDonationTestCase`, will be removed in a future JAX release. Most of these utilities can be replaced by calls to standard Python & NumPy testing utilities found in e.g. `unittest`, `absl.testing`, `numpy.testing`, etc. JAX-specific functionality such as device checking can be replaced through the use of public APIs such as `jax.devices()`. Many of the deprecated utilities will still exist in `jax._src.test_util`, but these are not public APIs and as such may be changed or removed without notice in future releases.
jax 0.3.7 (April 15, 2022)#
Changes:
- Fixed a performance problem if the indices passed to `jax.numpy.take_along_axis()` were broadcasted (#10281).
- `jax.scipy.special.expit()` and `jax.scipy.special.logit()` now require their arguments to be scalars or JAX arrays. They also now promote integer arguments to floating point.
- The `DeviceArray.tile()` method is deprecated, because numpy arrays do not have a `tile()` method. As a replacement, use `jax.numpy.tile()` (#10266).
jaxlib 0.3.7 (April 15, 2022)#
Changes:
Linux wheels are now built conforming to the `manylinux2014` standard, instead of `manylinux2010`.
jax 0.3.6 (April 12, 2022)#
jax 0.3.5 (April 7, 2022)#
Changes:
- Added `jax.random.loggamma()` and improved behavior of `jax.random.beta()` and `jax.random.dirichlet()` for small parameter values (#9906).
- The private `lax_numpy` submodule is no longer exposed in the `jax.numpy` namespace (#10029).
- Added array creation routines `jax.numpy.frombuffer()`, `jax.numpy.fromfunction()`, and `jax.numpy.fromstring()` (#10049).
- `DeviceArray.copy()` now returns a `DeviceArray` rather than a `np.ndarray` (#10069).
- `jax.experimental.sharded_jit` has been deprecated and will be removed soon.
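A minimal sketch of one of the new array creation routines, `jax.numpy.frombuffer()`, reading a raw byte buffer produced by NumPy:

```python
import numpy as np
import jax.numpy as jnp

# Serialize four float32 values to raw bytes, then read them back as a JAX array.
buf = np.arange(4, dtype=np.float32).tobytes()
x = jnp.frombuffer(buf, dtype=jnp.float32)
```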
Deprecations:
- `jax.nn.normalize()` is being deprecated. Use `jax.nn.standardize()` instead (#9899).
- `jax.tree_util.tree_multimap()` is deprecated. Use `jax.tree_util.tree_map()` instead (#5746).
- `jax.experimental.sharded_jit` is deprecated. Use `pjit` instead.
jaxlib 0.3.5 (April 7, 2022)#
jax 0.3.4 (March 18, 2022)#
jax 0.3.3 (March 17, 2022)#
jax 0.3.2 (March 16, 2022)#
Changes:
- The functions `jax.ops.index_update` and `jax.ops.index_add`, which were deprecated in 0.2.22, have been removed. Please use the `.at` property on JAX arrays instead, e.g., `x.at[idx].set(y)`.
- Moved `jax.experimental.ann.approx_*_k` into `jax.lax`. These functions are optimized alternatives to `jax.lax.top_k`.
- `jax.numpy.broadcast_arrays()` and `jax.numpy.broadcast_to()` now require scalar or array-like inputs, and will fail if they are passed lists (part of #7737).
- The standard jax[tpu] install can now be used with Cloud TPU v4 VMs.
- `pjit` now works on CPU (in addition to previous TPU and GPU support).
jaxlib 0.3.2 (March 16, 2022)#
Changes
`XlaComputation.as_hlo_text()` now supports printing large constants by passing the boolean flag `print_large_constants=True`.
Deprecations:
The `.block_host_until_ready()` method on JAX arrays has been deprecated. Use `.block_until_ready()` instead.
jax 0.3.1 (Feb 18, 2022)#
Changes:
- `jax.test_util.JaxTestCase` and `jax.test_util.JaxTestLoader` are now deprecated. The suggested replacement is to use `parametrized.TestCase` directly. For tests that rely on custom asserts such as `JaxTestCase.assertAllClose()`, the suggested replacement is to use standard numpy testing utilities such as `numpy.testing.assert_allclose()`, which work directly with JAX arrays (#9620).
- `jax.test_util.JaxTestCase` now sets `jax_numpy_rank_promotion='raise'` by default (#9562). To recover the previous behavior, use the new `jax.test_util.with_config` decorator:

      @jtu.with_config(jax_numpy_rank_promotion='allow')
      class MyTestCase(jtu.JaxTestCase):
        ...
Added `jax.scipy.linalg.schur()`, `jax.scipy.linalg.sqrtm()`, `jax.scipy.signal.csd()`, `jax.scipy.signal.stft()`, and `jax.scipy.signal.welch()`.
jax 0.3.0 (Feb 10, 2022)#
Changes
jax version has been bumped to 0.3.0. Please see the design doc for the explanation.
jaxlib 0.3.0 (Feb 10, 2022)#
Changes
Bazel 5.0.0 is now required to build jaxlib.
jaxlib version has been bumped to 0.3.0. Please see the design doc for the explanation.
jax 0.2.28 (Feb 1, 2022)#
- `jax.jit(f).lower(...).compiler_ir()` now defaults to the MHLO dialect if no `dialect=` is passed.
- `jax.jit(f).lower(...).compiler_ir(dialect='mhlo')` now returns an MLIR `ir.Module` object instead of its string representation.
jaxlib 0.1.76 (Jan 27, 2022)#
New features
Includes precompiled SASS for NVidia compute capability 8.0 GPUs (e.g. A100). Removes precompiled SASS for compute capability 6.1 so as not to increase the number of compute capabilities: GPUs with compute capability 6.1 can use the 6.0 SASS.
With jaxlib 0.1.76, JAX uses the MHLO MLIR dialect as its primary target compiler IR by default.
Breaking changes
Support for NumPy 1.18 has been dropped, per the deprecation policy. Please upgrade to a supported NumPy version.
Bug fixes
Fixed a bug where apparently identical pytreedef objects constructed by different routes did not compare as equal (#9066).
The JAX jit cache now requires two static arguments to have identical types for a cache hit (#9311).
jax 0.2.27 (Jan 18 2022)#
Breaking changes:
Support for NumPy 1.18 has been dropped, per the deprecation policy. Please upgrade to a supported NumPy version.
- The host_callback primitives have been simplified to drop the special autodiff handling for `hcb.id_tap` and `id_print`. From now on, only the primals are tapped. The old behavior can be obtained (for a limited time) by setting the `JAX_HOST_CALLBACK_AD_TRANSFORMS` environment variable, or the `--jax_host_callback_ad_transforms` flag. Additionally, added documentation for how to implement the old behavior using JAX custom AD APIs (#8678).
- Sorting now matches the behavior of NumPy for `0.0` and `NaN` regardless of the bit representation. In particular, `0.0` and `-0.0` are now treated as equivalent, where previously `-0.0` was treated as less than `0.0`. Additionally, all `NaN` representations are now treated as equivalent and sorted to the end of the array. Previously negative `NaN` values were sorted to the front of the array, and `NaN` values with different internal bit representations were not treated as equivalent, and were sorted according to those bit patterns (#9178).
- `jax.numpy.unique()` now treats `NaN` values in the same way as `np.unique` in NumPy versions 1.21 and newer: at most one `NaN` value will appear in the uniquified output (#9184).
Bug fixes:
host_callback now supports ad_checkpoint.checkpoint (#8907).
New features:
- Added `jax.block_until_ready` (#8941).
- Added a new debugging flag/environment variable `JAX_DUMP_IR_TO=/path`. If set, JAX dumps the MHLO/HLO IR it generates for each computation to a file under the given path.
- Added `jax.ensure_compile_time_eval` to the public api (#7987).
- jax2tf now supports a flag `jax2tf_associative_scan_reductions` to change the lowering for associative reductions, e.g., `jnp.cumsum`, to behave like JAX on CPU and GPU (to use an associative scan). See the jax2tf README for more details (#9189).
jaxlib 0.1.75 (Dec 8, 2021)#
New features:
Support for python 3.10.
jax 0.2.26 (Dec 8, 2021)#
Bug fixes:
- Out-of-bounds indices to `jax.ops.segment_sum` will now be handled with `FILL_OR_DROP` semantics, as documented. This primarily affects the reverse-mode derivative, where gradients corresponding to out-of-bounds indices will now be returned as 0 (#8634).
- jax2tf will force the converted code to use XLA for the code fragments under `jax.jit`, e.g., most `jax.numpy` functions (#7839).
jaxlib 0.1.74 (Nov 17, 2021)#
Enabled peer-to-peer copies between GPUs. Previously, GPU copies were bounced via the host, which is usually slower.
Added experimental MLIR Python bindings for use by JAX.
jax 0.2.25 (Nov 10, 2021)#
New features:
- (Experimental) `jax.distributed.initialize` exposes a multi-host GPU backend.
- `jax.random.permutation` supports a new `independent` keyword argument (#8430).
Breaking changes
- Moved `jax.experimental.stax` to `jax.example_libraries.stax`
- Moved `jax.experimental.optimizers` to `jax.example_libraries.optimizers`
New features:
Added `jax.lax.linalg.qdwh`.
jax 0.2.24 (Oct 19, 2021)#
jaxlib 0.1.73 (Oct 18, 2021)#
Multiple cuDNN versions are now supported for jaxlib GPU `cuda11` wheels:
- cuDNN 8.2 or newer. We recommend using the cuDNN 8.2 wheel if your cuDNN installation is new enough, since it supports additional functionality.
- cuDNN 8.0.5 or newer.
Breaking changes:
The install commands for GPU jaxlib are as follows:

    pip install --upgrade pip

    # Installs the wheel compatible with CUDA 11 and cuDNN 8.2 or newer.
    pip install --upgrade "jax[cuda]" -f https://storage.googleapis.com/jax-releases/jax_releases.html

    # Installs the wheel compatible with CUDA 11 and cuDNN 8.2 or newer.
    pip install jax[cuda11_cudnn82] -f https://storage.googleapis.com/jax-releases/jax_releases.html

    # Installs the wheel compatible with CUDA 11 and cuDNN 8.0.5 or newer.
    pip install jax[cuda11_cudnn805] -f https://storage.googleapis.com/jax-releases/jax_releases.html
jax 0.2.22 (Oct 12, 2021)#
Breaking Changes
- Static arguments to `jax.pmap` must now be hashable. Unhashable static arguments have long been disallowed on `jax.jit`, but they were still permitted on `jax.pmap`; `jax.pmap` compared unhashable static arguments using object identity. This behavior is a footgun, since comparing arguments using object identity leads to recompilation each time the object identity changes. Instead, we now ban unhashable arguments: if a user of `jax.pmap` wants to compare static arguments by object identity, they can define `__hash__` and `__eq__` methods on their objects that do that, or wrap their objects in an object that has those operations with object identity semantics. Another option is to use `functools.partial` to encapsulate the unhashable static arguments into the function object.
- `jax.util.partial` was an accidental export that has now been removed. Use `functools.partial` from the Python standard library instead.
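A sketch of the `functools.partial` workaround described above; the `apply` function and its dict config are hypothetical, and the example maps over however many local devices are available:

```python
import functools
import jax
import jax.numpy as jnp

def apply(config, x):
    # config is an ordinary (unhashable) dict.
    return x * config["scale"]

# Encapsulate the unhashable dict into the function object via partial,
# instead of passing it to jax.pmap as a static argument.
f = jax.pmap(functools.partial(apply, {"scale": 2.0}))

n = jax.local_device_count()
out = f(jnp.ones((n, 3)))
```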
Deprecations
The functions `jax.ops.index_update`, `jax.ops.index_add`, etc. are deprecated and will be removed in a future JAX release. Please use the `.at` property on JAX arrays instead, e.g., `x.at[idx].set(y)`. For now, these functions produce a `DeprecationWarning`.
New features:
- An optimized C++ code path improving the dispatch time for `pmap` is now the default when using jaxlib 0.1.72 or newer. The feature can be disabled using the `--experimental_cpp_pmap` flag (or the `JAX_CPP_PMAP` environment variable).
- `jax.numpy.unique` now supports an optional `fill_value` argument (#8121).
jaxlib 0.1.72 (Oct 12, 2021)#
Breaking changes:
Support for CUDA 10.2 and CUDA 10.1 has been dropped. Jaxlib now supports CUDA 11.1+.
Bug fixes:
Fixes https://github.com/google/jax/issues/7461, which caused wrong outputs on all platforms due to incorrect buffer aliasing inside the XLA compiler.
jax 0.2.21 (Sept 23, 2021)#
Breaking Changes
- `jax.api` has been removed. Functions that were available as `jax.api.*` were aliases for functions in `jax.*`; please use the functions in `jax.*` instead.
- `jax.partial` and `jax.lax.partial` were accidental exports that have now been removed. Use `functools.partial` from the Python standard library instead.
- Boolean scalar indices now raise a `TypeError`; previously this silently returned wrong results (#7925).
- Many more `jax.numpy` functions now require array-like inputs, and will error if passed a list (#7747, #7802, #7907). See #7737 for a discussion of the rationale behind this change.
- When inside a transformation such as `jax.jit`, `jax.numpy.array` always stages the array it produces into the traced computation. Previously `jax.numpy.array` would sometimes produce an on-device array, even under a `jax.jit` decorator. This change may break code that used JAX arrays to perform shape or index computations that must be known statically; the workaround is to perform such computations using classic NumPy arrays instead.
- `jnp.ndarray` is now a true base class for JAX arrays. In particular, this means that for a standard numpy array `x`, `isinstance(x, jnp.ndarray)` will now return `False` (#7927).
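A minimal sketch of the recommended workaround above: do shape arithmetic that must stay static with classic NumPy rather than `jnp`, so it is not staged into the traced computation:

```python
import numpy as np
import jax
import jax.numpy as jnp

@jax.jit
def flatten(x):
    # np.prod on x.shape runs at trace time and yields a plain Python int,
    # so the target shape stays static under jit.
    n = int(np.prod(x.shape))
    return jnp.reshape(x, (n,))

y = flatten(jnp.ones((2, 3)))
```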
New features:
Added a `jax.numpy.insert()` implementation (#7936).
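A one-line sketch of the new `jax.numpy.insert()`, which follows `numpy.insert` semantics:

```python
import jax.numpy as jnp

x = jnp.array([0, 1, 2])
# Insert 99 before index 1, as numpy.insert does.
y = jnp.insert(x, 1, 99)
```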
jax 0.2.20 (Sept 2, 2021)#
Breaking Changes
jaxlib 0.1.71 (Sep 1, 2021)#
Breaking changes:
Support for CUDA 11.0 and CUDA 10.1 has been dropped. Jaxlib now supports CUDA 10.2 and CUDA 11.1+.
jax 0.2.19 (Aug 12, 2021)#
Breaking changes:
Support for NumPy 1.17 has been dropped, per the deprecation policy. Please upgrade to a supported NumPy version.
The `jit` decorator has been added around the implementation of a number of operators on JAX arrays. This speeds up dispatch times for common operators such as `+`. This change should largely be transparent to most users. However, there is one known behavioral change: large integer constants may now produce an error when passed directly to a JAX operator (e.g., `x + 2**40`). The workaround is to cast the constant to an explicit type (e.g., `np.float64(2**40)`).
New features:
Improved the support for shape polymorphism in jax2tf for operations that need to use a dimension size in array computation, e.g., `jnp.mean` (#7317).
Bug fixes:
Fixed some leaked trace errors from the previous release (#7613).
jaxlib 0.1.70 (Aug 9, 2021)#
Breaking changes:
Support for Python 3.6 has been dropped, per the deprecation policy. Please upgrade to a supported Python version.
Support for NumPy 1.17 has been dropped, per the deprecation policy. Please upgrade to a supported NumPy version.
The host_callback mechanism now uses one thread per local device for making the calls to the Python callbacks. Previously there was a single thread for all devices. This means that the callbacks may now be called interleaved. The callbacks corresponding to one device will still be called in sequence.
jax 0.2.18 (July 21 2021)#
Breaking changes:
Support for Python 3.6 has been dropped, per the deprecation policy. Please upgrade to a supported Python version.
The minimum jaxlib version is now 0.1.69.
The `backend` argument to `jax.dlpack.from_dlpack()` has been removed.
New features:
Added a polar decomposition (`jax.scipy.linalg.polar()`).
Bug fixes:
Tightened the checks for `lax.argmin` and `lax.argmax` to ensure they are not used with an invalid `axis` value, or with an empty reduction dimension (#7196).
jaxlib 0.1.69 (July 9 2021)#
Fixed bugs in the TFRT CPU backend that resulted in incorrect results.
jax 0.2.17 (July 9 2021)#
Bug fixes:
Default to the older “stream_executor” CPU runtime for jaxlib <= 0.1.68 to work around #7229, which caused wrong outputs on CPU due to a concurrency problem.
New features:
- New SciPy function `jax.scipy.special.sph_harm()`.
- Reverse-mode autodiff functions (`jax.grad()`, `jax.value_and_grad()`, `jax.vjp()`, and `jax.linear_transpose()`) support a parameter that indicates which named axes should be summed over in the backward pass if they were broadcasted over in the forward pass. This enables use of these APIs in a non-per-example way inside maps (initially only `jax.experimental.maps.xmap()`) (#6950).
jax 0.2.16 (June 23 2021)#
jax 0.2.15 (June 23 2021)#
New features:
#7042 Turned on TFRT CPU backend with significant dispatch performance improvements on CPU.
- `jax2tf.convert()` supports inequalities and min/max for booleans (#6956).
- New SciPy function `jax.scipy.special.lpmn_values()`.
Breaking changes:
Support for NumPy 1.16 has been dropped, per the deprecation policy.
Bug fixes:
Fixed a bug that prevented round-tripping from JAX to TF and back: `jax2tf.call_tf(jax2tf.convert)` (#6947).
jaxlib 0.1.68 (June 23 2021)#
Bug fixes:
Fixed a bug in the TFRT CPU backend that produced NaNs when transferring a TPU buffer to CPU.
jax 0.2.14 (June 10 2021)#
New features:
- `jax2tf.convert()` now has support for `pjit` and `sharded_jit`.
- A new configuration option `JAX_TRACEBACK_FILTERING` controls how JAX filters tracebacks.
- A new traceback filtering mode using `__tracebackhide__` is now enabled by default in sufficiently recent versions of IPython.
- `jax2tf.convert()` supports shape polymorphism even when the unknown dimensions are used in arithmetic operations, e.g., `jnp.reshape(-1)` (#6827).
- `jax2tf.convert()` generates custom attributes with location information in TF ops. The code that XLA generates after jax2tf has the same location information as JAX/XLA.
- New SciPy function `jax.scipy.special.lpmn()`.
Bug fixes:
- `jax2tf.convert()` now ensures that it uses the same typing rules for Python scalars and for choosing 32-bit vs. 64-bit computations as JAX (#6883).
- `jax2tf.convert()` now scopes the `enable_xla` conversion parameter properly to apply only during the just-in-time conversion (#6720).
- `jax2tf.convert()` now converts `lax.dot_general` using the `XlaDot` TensorFlow op, for better fidelity w.r.t. JAX numerical precision (#6717).
- `jax2tf.convert()` now has support for inequality comparisons and min/max for complex numbers (#6892).
jaxlib 0.1.67 (May 17 2021)#
jaxlib 0.1.66 (May 11 2021)#
New features:
CUDA 11.1 wheels are now supported on all CUDA 11 versions 11.1 or higher.
NVidia now promises compatibility between CUDA minor releases starting with CUDA 11.1. This means that JAX can release a single CUDA 11.1 wheel that is compatible with CUDA 11.2 and 11.3.
There is no longer a separate jaxlib release for CUDA 11.2 (or higher); use the CUDA 11.1 wheel for those versions (cuda111).
- Jaxlib now bundles `libdevice.10.bc` in CUDA wheels. There should be no need to point JAX to a CUDA installation to find this file.
- Added automatic support for static keyword arguments to the `jit()` implementation.
- Added support for pretransformation exception traces.
- Initial support for pruning unused arguments from `jit()`-transformed computations. Pruning is still a work in progress.
- Improved the string representation of `PyTreeDef` objects.
- Added support for XLA's variadic ReduceWindow.
Bug fixes:
Fixed a bug in the remote cloud TPU support when large numbers of arguments are passed to a computation.
Fixed a bug that meant that JAX garbage collection was not triggered by `jit()`-transformed functions.
jax 0.2.13 (May 3 2021)#
New features:
- When combined with jaxlib 0.1.66, `jax.jit()` now supports static keyword arguments. A new `static_argnames` option has been added to specify keyword arguments as static.
- `jax.nonzero()` has a new optional `size` argument that allows it to be used within `jit` (#6501).
- `jax.numpy.unique()` now supports the `axis` argument (#6532).
- `jax.experimental.host_callback.call()` now supports `pjit.pjit` (#6569).
- Added `jax.scipy.linalg.eigh_tridiagonal()` that computes the eigenvalues of a tridiagonal matrix. Only eigenvalues are supported at present.
- The order of the filtered and unfiltered stack traces in exceptions has been changed. The traceback attached to an exception thrown from JAX-transformed code is now filtered, with an `UnfilteredStackTrace` exception containing the original trace as the `__cause__` of the filtered exception. Filtered stack traces now also work with Python 3.6.
- If an exception is thrown by code that has been transformed by reverse-mode automatic differentiation, JAX now attempts to attach as a `__cause__` of the exception a `JaxStackTraceBeforeTransformation` object that contains the stack trace that created the original operation in the forward pass. Requires jaxlib 0.1.66.
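A minimal sketch of the new `static_argnames` option; the `scale` function and its `mode` argument are illustrative:

```python
import functools
import jax
import jax.numpy as jnp

@functools.partial(jax.jit, static_argnames="mode")
def scale(x, mode):
    # `mode` is a static (compile-time) keyword argument, so ordinary
    # Python control flow on it is allowed under jit.
    return x * 2 if mode == "double" else x

y = scale(jnp.asarray(3.0), mode="double")
```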
Breaking changes:
- The following function names have changed. There are still aliases, so this should not break existing code, but the aliases will eventually be removed so please change your code.
  - `host_id` --> `process_index()`
  - `host_count` --> `process_count()`
  - `host_ids` --> `range(jax.process_count())`
- Similarly, the argument to `local_devices()` has been renamed from `host_id` to `process_index`.
- Arguments to `jax.jit()` other than the function are now marked as keyword-only. This change is to prevent accidental breakage when arguments are added to `jit`.
Bug fixes:
jaxlib 0.1.65 (April 7 2021)#
jax 0.2.12 (April 1 2021)#
New features
- New profiling APIs: `jax.profiler.start_trace()`, `jax.profiler.stop_trace()`, and `jax.profiler.trace()`.
- `jax.lax.reduce()` is now differentiable.
Breaking changes:
The minimum jaxlib version is now 0.1.64.
Some profiler API names have been changed. There are still aliases, so this should not break existing code, but the aliases will eventually be removed so please change your code.
  - `TraceContext` --> `TraceAnnotation`
  - `StepTraceContext` --> `StepTraceAnnotation`
  - `trace_function` --> `annotate_function`
Omnistaging can no longer be disabled. See omnistaging for more information.
- Python integers larger than the maximum `int64` value will now lead to an overflow in all cases, rather than being silently converted to `uint64` in some cases (#6047).
- Outside X64 mode, Python integers outside the range representable by `int32` will now lead to an `OverflowError` rather than having their value silently truncated.
Bug fixes:
- `host_callback` now supports empty arrays in arguments and results (#6262).
- `jax.random.randint()` clips rather than wraps out-of-bounds limits, and can now generate integers in the full range of the specified dtype (#5868).
jax 0.2.11 (March 23 2021)#
New features:
Bug fixes:
- #6136 generalized `jax.flatten_util.ravel_pytree` to handle integer dtypes.
- #6129 fixed a bug with handling some constants like `enum.IntEnum`s.
- #6145 fixed batching issues with incomplete beta functions.
- #6014 fixed H2D transfers during tracing.
- #6165 avoids OverflowErrors when converting some large Python integers to floats.
Breaking changes:
The minimum jaxlib version is now 0.1.62.
jaxlib 0.1.64 (March 18 2021)#
jaxlib 0.1.63 (March 17 2021)#
jax 0.2.10 (March 5 2021)#
New features:
- `jax.scipy.stats.chi2()` is now available as a distribution with logpdf and pdf methods.
- `jax.scipy.stats.betabinom()` is now available as a distribution with logpmf and pmf methods.
- Added `jax.experimental.jax2tf.call_tf()` to call TensorFlow functions from JAX (#5627 and README).
- Extended the batching rule for `lax.pad` to support batching of the padding values.
Bug fixes:
`jax.numpy.take()` properly handles negative indices (#5768).
Breaking changes:
- JAX's promotion rules were adjusted to make promotion more consistent and invariant to JIT. In particular, binary operations can now result in weakly-typed values when appropriate. The main user-visible effect of the change is that some operations result in outputs of different precision than before; for example the expression `jnp.bfloat16(1) + 0.1 * jnp.arange(10)` previously returned a `float64` array, and now returns a `bfloat16` array. JAX's type promotion behavior is described at Type promotion semantics.
- `jax.numpy.linspace()` now computes the floor of integer values, i.e., rounding towards -inf rather than 0. This change was made to match NumPy 1.20.0.
- `jax.numpy.i0()` no longer accepts complex numbers. Previously the function computed the absolute value of complex arguments. This change was made to match the semantics of NumPy 1.20.0.
- Several `jax.numpy` functions no longer accept tuples or lists in place of array arguments: `jax.numpy.pad()`, `jax.numpy.ravel()`, `jax.numpy.repeat()`, `jax.numpy.reshape()`. In general, `jax.numpy` functions should be used with scalars or array arguments.
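A sketch of the weak-typing effect described above, assuming a JAX recent enough to follow these promotion rules: a Python float stays weakly typed when combined with an integer array, so the `bfloat16` operand determines the result dtype:

```python
import jax.numpy as jnp

# 0.1 is a weakly-typed Python float; multiplying by an int array keeps
# the result weakly typed, so adding a bfloat16 yields bfloat16.
y = jnp.bfloat16(1) + 0.1 * jnp.arange(10)
```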
jaxlib 0.1.62 (March 9 2021)#
New features:
- jaxlib wheels are now built to require AVX instructions on x86-64 machines by default. If you want to use JAX on a machine that doesn’t support AVX, you can build a jaxlib from source using the `--target_cpu_features` flag to `build.py`. `--target_cpu_features` also replaces `--enable_march_native`.
jaxlib 0.1.61 (February 12 2021)#
jaxlib 0.1.60 (February 3 2021)#
Bug fixes:
- Fixed a memory leak when converting CPU DeviceArrays to NumPy arrays. The memory leak was present in jaxlib releases 0.1.58 and 0.1.59.
- `bool`, `int8`, and `uint8` are now considered safe to cast to the `bfloat16` NumPy extension type.
jax 0.2.9 (January 26 2021)#
New features:
- Extend the `jax.experimental.loops` module with support for pytrees. Improved error checking and error messages.
- Add `jax.experimental.enable_x64` and `jax.experimental.disable_x64`. These are context managers which allow X64 mode to be temporarily enabled/disabled within a session.
Breaking changes:
- `jax.ops.segment_sum` now drops segment IDs that are out of range rather than wrapping them into the segment ID space. This was done for performance reasons.
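The out-of-range behavior can be sketched as follows; segment ID 7 falls outside `num_segments=3` and its value is dropped rather than wrapped:

```python
import jax.numpy as jnp
from jax.ops import segment_sum

data = jnp.array([1.0, 2.0, 3.0, 4.0])
segment_ids = jnp.array([0, 0, 2, 7])  # 7 is out of range

# The out-of-range entry (4.0) is dropped, not added to segment 7 % 3.
totals = segment_sum(data, segment_ids, num_segments=3)
print(totals)  # [3. 0. 3.]
```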
jaxlib 0.1.59 (January 15 2021)#
jax 0.2.8 (January 12 2021)#
New features:
- Add `jax.closure_convert` for use with higher-order custom derivative functions. (#5244)
- Add `jax.experimental.host_callback.call` to call a custom Python function on the host and return a result to the device computation. (#5243)
Bug fixes:
- `jax.numpy.arccosh` now returns the same branch as `numpy.arccosh` for complex inputs (#5156).
- `host_callback.id_tap` now works for `jax.pmap` also. There is an optional parameter for `id_tap` and `id_print` to request that the device from which the value is tapped be passed as a keyword argument to the tap function (#5182).
Breaking changes:
- `jax.numpy.pad` now takes keyword arguments. Positional argument `constant_values` has been removed. In addition, passing unsupported keyword arguments raises an error.
- Changes for `jax.experimental.host_callback.id_tap` (#5243):
  - Removed support for `kwargs` for `jax.experimental.host_callback.id_tap`. (This support had been deprecated for a few months.)
  - Changed the printing of tuples for `jax.experimental.host_callback.id_print` to use ‘(’ instead of ‘[’.
  - Changed `jax.experimental.host_callback.id_print` in the presence of JVP to print a pair of primal and tangent values. Previously, there were two separate print operations for the primals and the tangent.
  - `host_callback.outfeed_receiver` has been removed (it is not necessary, and was deprecated a few months ago).
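Following the `jax.numpy.pad` change above, `constant_values` is passed by keyword; a minimal sketch:

```python
import jax.numpy as jnp

x = jnp.arange(3)
# constant_values must now be a keyword argument; positional passing was removed.
padded = jnp.pad(x, (1, 1), mode="constant", constant_values=9)
print(padded)  # [9 0 1 2 9]
```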
New features:
- New flag for debugging `inf`, analogous to that for `NaN` (#5224).
jax 0.2.7 (Dec 4 2020)#
New features:
- Add `jax.device_put_replicated`
- Add multi-host support to `jax.experimental.sharded_jit`
- Add support for differentiating eigenvalues computed by `jax.numpy.linalg.eig`
- Add support for building on Windows platforms
- Add support for general `in_axes` and `out_axes` in `jax.pmap`
- Add complex support for `jax.numpy.linalg.slogdet`
Bug fixes:
- Fix higher-than-second order derivatives of `jax.numpy.sinc` at zero
- Fix some hard-to-hit bugs around symbolic zeros in transpose rules
Breaking changes:
- `jax.experimental.optix` has been deleted, in favor of the standalone `optax` Python package.
- Indexing of JAX arrays with non-tuple sequences now raises a `TypeError`. This type of indexing has been deprecated in NumPy since v1.16, and in JAX since v0.2.4. See #4564.
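The indexing change above means list indices must be converted to arrays explicitly; a minimal sketch:

```python
import jax.numpy as jnp

x = jnp.arange(4)

# Indexing with a bare list raises a TypeError...
try:
    x[[0, 2]]
    rejected = False
except TypeError:
    rejected = True

# ...the supported spelling wraps the sequence in an array.
selected = x[jnp.array([0, 2])]
print(rejected, selected)  # True [0 2]
```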
jax 0.2.6 (Nov 18 2020)#
New Features:
- Add support for shape-polymorphic tracing for the `jax.experimental.jax2tf` converter. See README.md.
Breaking change cleanup
- Raise an error on non-hashable static arguments for `jax.jit` and `xla_computation`. See cb48f42.
- Improve consistency of type promotion behavior (#4744):
  - Adding a complex Python scalar to a JAX floating point number respects the precision of the JAX float. For example, `jnp.float32(1) + 1j` now returns `complex64`, where previously it returned `complex128`.
  - Results of type promotion with 3 or more terms involving uint64, a signed int, and a third type are now independent of the order of arguments. For example: `jnp.result_type(jnp.uint64, jnp.int64, jnp.float16)` and `jnp.result_type(jnp.float16, jnp.uint64, jnp.int64)` both return `float16`, where previously the first returned `float64` and the second returned `float16`.
- The contents of the (undocumented) `jax.lax_linalg` linear algebra module are now exposed publicly as `jax.lax.linalg`.
- `jax.random.PRNGKey` now produces the same results in and out of JIT compilation (#4877). This required changing the result for a given seed in a few particular cases:
  - With `jax_enable_x64=False`, negative seeds passed as Python integers now return a different result outside JIT mode. For example, `jax.random.PRNGKey(-1)` previously returned `[4294967295, 4294967295]`, and now returns `[0, 4294967295]`. This matches the behavior in JIT.
  - Seeds outside the range representable by `int64` outside JIT now result in an `OverflowError` rather than a `TypeError`. This matches the behavior in JIT.
  - To recover the keys returned previously for negative integers with `jax_enable_x64=False` outside JIT, you can use `key = random.PRNGKey(-1).at[0].set(0xFFFFFFFF)`.
- `DeviceArray` now raises `RuntimeError` instead of `ValueError` when trying to access its value while it has been deleted.
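The PRNGKey consistency described above can be checked directly; a minimal sketch assuming the default `jax_enable_x64=False`:

```python
import jax

# A negative Python-int seed now produces the same key outside JIT as
# inside it: [0, 4294967295] rather than the old [4294967295, 4294967295].
key = jax.random.PRNGKey(-1)
print(key.tolist())  # [0, 4294967295]
```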
jaxlib 0.1.58 (January 12ish 2021)#
- Fixed a bug that meant JAX sometimes returned platform-specific types (e.g., `np.cint`) instead of standard types (e.g., `np.int32`). (#4903)
- Fixed a crash when constant-folding certain int16 operations. (#4971)
- Added an `is_leaf` predicate to `pytree.flatten()`.
jaxlib 0.1.57 (November 12 2020)#
- Fixed manylinux2010 compliance issues in GPU wheels.
- Switched the CPU FFT implementation from Eigen to PocketFFT.
- Fixed a bug where the hash of bfloat16 values was not correctly initialized and could change (#4651).
- Add support for retaining ownership when passing arrays to DLPack (#4636).
- Fixed a bug for batched triangular solves with sizes greater than 128 but not a multiple of 128.
- Fixed a bug when performing concurrent FFTs on multiple GPUs (#3518).
- Fixed a bug in the profiler where tools were missing (#4427).
- Dropped support for CUDA 10.0.
jax 0.2.5 (October 27 2020)#
Improvements:
- Ensure that `check_jaxpr` does not perform FLOPS. See #4650.
- Expanded the set of JAX primitives converted by jax2tf. See primitives_with_limited_support.md.
jax 0.2.4 (October 19 2020)#
jaxlib 0.1.56 (October 14, 2020)#
jax 0.2.3 (October 14 2020)#
The reason for another release so soon is that we needed to temporarily roll back the new jit fastpath while we look into a performance degradation.
jax 0.2.2 (October 13 2020)#
jax 0.2.1 (October 6 2020)#
Improvements:
- As a benefit of omnistaging, the host_callback functions are executed (in program order) even if the result of `jax.experimental.host_callback.id_print`/`jax.experimental.host_callback.id_tap` is not used in the computation.
jax (0.2.0) (September 23 2020)#
Improvements:
- Omnistaging on by default. See #3370 and omnistaging.
jax (0.1.77) (September 15 2020)#
Breaking changes:
- New simplified interface for `jax.experimental.host_callback.id_tap` (#4101).
jaxlib 0.1.55 (September 8, 2020)#
Update XLA:
- Fix bug in DLPackManagedTensorToBuffer (#4196)
jax 0.1.76 (September 8, 2020)#
jax 0.1.75 (July 30, 2020)#
Bug Fixes:
- Make `jnp.abs` work for unsigned inputs (#3914)
Improvements:
- “Omnistaging” behavior added behind a flag, disabled by default (#3370)
jax 0.1.74 (July 29, 2020)#
New Features:
- BFGS (#3101)
- TPU support for half-precision arithmetic (#3878)
Bug Fixes:
- Prevent some accidental dtype warnings (#3874)
- Fix a multi-threading bug in custom derivatives (#3845, #3869)
Improvements:
- Faster searchsorted implementation (#3873)
- Better test coverage for jax.numpy sorting algorithms (#3836)
jaxlib 0.1.52 (July 22, 2020)#
- Update XLA.
jax 0.1.73 (July 22, 2020)#
- The minimum jaxlib version is now 0.1.51.
New Features:
- `jax.image.resize`. (#3703)
- `hfft` and `ihfft` (#3664)
- `jax.numpy.intersect1d` (#3726)
- `jax.numpy.lexsort` (#3812)
- `lax.scan` and the `scan` primitive support an `unroll` parameter for loop unrolling when lowering to XLA (#3738).
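The `unroll` parameter of `lax.scan` can be used as follows; a minimal cumulative-sum sketch:

```python
import jax
import jax.numpy as jnp

def step(carry, x):
    carry = carry + x
    return carry, carry  # (new carry, per-step output)

xs = jnp.arange(4.0)
# unroll=2 unrolls two steps of the loop body per XLA while-loop
# iteration; the results are identical to the default unroll=1.
final, partials = jax.lax.scan(step, 0.0, xs, unroll=2)
print(final, partials)  # 6.0 [0. 1. 3. 6.]
```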
Bug Fixes:
- Fix reduction repeated axis error (#3618)
- Fix shape rule for `lax.pad` for input dimensions of size 0. (#3608)
- Make `psum` transpose handle zero cotangents (#3653)
- Fix shape error when taking JVP of reduce-prod over size 0 axis. (#3729)
- Support differentiation through `jax.lax.all_to_all` (#3733)
- Address NaN issue in `jax.scipy.special.zeta` (#3777)
Improvements:
- Many improvements to jax2tf
- Reimplement argmin/argmax using a single pass variadic reduction. (#3611)
- Enable XLA SPMD partitioning by default. (#3151)
- Add support for 0d transpose convolution (#3643)
- Make LU gradient work for low-rank matrices (#3610)
- Support `multiple_results` and custom JVPs in jet (#3657)
- Generalize reduce-window padding to support (lo, hi) pairs. (#3728)
- Implement complex convolutions on CPU and GPU. (#3735)
- Make `jnp.take` work for empty slices of empty arrays. (#3751)
- Relax dimension ordering rules for `dot_general`. (#3778)
- Enable buffer donation for GPU. (#3800)
- Add support for base dilation and window dilation to reduce window op… (#3803)
jaxlib 0.1.51 (July 2, 2020)#
- Update XLA.
- Add new runtime support for host_callback.
jax 0.1.72 (June 28, 2020)#
Bug fixes:
- Fix an odeint bug introduced in the previous release, see #3587.
jax 0.1.71 (June 25, 2020)#
- The minimum jaxlib version is now 0.1.48.
Bug fixes:
- Allow `jax.experimental.ode.odeint` dynamics functions to close over values with respect to which we’re differentiating #3562.
jaxlib 0.1.50 (June 25, 2020)#
- Add support for CUDA 11.0.
- Drop support for CUDA 9.2 (we only maintain support for the last four CUDA versions).
- Update XLA.
jaxlib 0.1.49 (June 19, 2020)#
Bug fixes:
- Fix build issue that could result in slow compiles (tensorflow/tensorflow)
jaxlib 0.1.48 (June 12, 2020)#
New features:
- Adds support for fast traceback collection.
- Adds preliminary support for on-device heap profiling.
- Implements `np.nextafter` for `bfloat16` types.
- Complex128 support for FFTs on CPU and GPU.
Bugfixes:
- Improved float64 `tanh` accuracy on GPU.
- float64 scatters on GPU are much faster.
- Complex matrix multiplication on CPU should be much faster.
- Stable sorts on CPU should actually be stable now.
- Concurrency bug fix in CPU backend.
jax 0.1.70 (June 8, 2020)#
New features:
- `lax.switch` introduces indexed conditionals with multiple branches, together with a generalization of the `cond` primitive #3318.
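A minimal sketch of `lax.switch` selecting among several branches by integer index:

```python
import jax

branches = [
    lambda x: x + 1.0,  # index 0
    lambda x: x * 2.0,  # index 1
    lambda x: -x,       # index 2
]
# The index is clamped into [0, len(branches) - 1] before dispatch.
out = jax.lax.switch(1, branches, 3.0)
print(out)  # 6.0
```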
jax 0.1.69 (June 3, 2020)#
jax 0.1.68 (May 21, 2020)#
New features:
- `lax.cond` supports a single-operand form, taken as the argument to both branches #2993.
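A minimal sketch of the single-operand form, where one value is passed to whichever branch runs:

```python
import jax

def true_fn(x):
    return x + 1.0

def false_fn(x):
    return x - 1.0

# The single operand 1.0 is passed to both branches; only one executes.
out = jax.lax.cond(True, true_fn, false_fn, 1.0)
print(out)  # 2.0
```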
Notable changes:
- The format of the `transforms` keyword for the `jax.experimental.host_callback.id_tap` primitive has changed #3132.
jax 0.1.67 (May 12, 2020)#
New features:
- Support for reduction over subsets of a pmapped axis using `axis_index_groups` #2382.
- Experimental support for printing and calling host-side Python functions from compiled code. See id_print and id_tap (#3006).
Notable changes:
- The visibility of names exported from `jax.numpy` has been tightened. This may break code that was making use of names that were previously exported accidentally.
jaxlib 0.1.47 (May 8, 2020)#
- Fixes crash for outfeed.
jax 0.1.66 (May 5, 2020)#
New features:
- Support for `in_axes=None` on `pmap` #2896.
jaxlib 0.1.46 (May 5, 2020)#
- Fixes crash for linear algebra functions on Mac OS X (#432).
- Fixes an illegal instruction crash caused by using AVX512 instructions when an operating system or hypervisor disabled them (#2906).
jax 0.1.65 (April 30, 2020)#
New features:
- Differentiation of determinants of singular matrices #2809.
jaxlib 0.1.45 (April 21, 2020)#
- Fixes segfault: #2755
- Plumb `is_stable` option on Sort HLO through to Python.
jax 0.1.64 (April 21, 2020)#
New features:
- Add syntactic sugar for functional indexed updates #2684.
- Add `jax.numpy.unique` #2760.
- Add `jax.numpy.rint` #2724.
- Add more primitive rules for `jax.experimental.jet`.
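Functional indexed updates return a new array rather than mutating in place. The original sugar from #2684 has since been superseded; in current JAX the equivalent spelling is the `.at` property:

```python
import jax.numpy as jnp

x = jnp.zeros(3)
# Out-of-place update: x itself is unchanged, y is a new array.
y = x.at[1].set(5.0)
print(x, y)  # [0. 0. 0.] [0. 5. 0.]
```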
Bug fixes:
Better errors:
- Improves error message for reverse-mode differentiation of `lax.while_loop` #2129.
jaxlib 0.1.44 (April 16, 2020)#
- Fixes a bug where if multiple GPUs of different models were present, JAX would only compile programs suitable for the first GPU.
- Bugfix for `batch_group_count` convolutions.
- Added precompiled SASS for more GPU versions to avoid startup PTX compilation hang.
jax 0.1.63 (April 12, 2020)#
- Added `jax.custom_jvp` and `jax.custom_vjp` from #2026; see the tutorial notebook. Deprecated `jax.custom_transforms` and removed it from the docs (though it still works).
- Add `scipy.sparse.linalg.cg` #2566.
- Changed how Tracers are printed to show more useful information for debugging #2591.
- Made `jax.numpy.isclose` handle `nan` and `inf` correctly #2501.
- Added several new rules for `jax.experimental.jet` #2537.
- Fixed `jax.experimental.stax.BatchNorm` when `scale`/`center` isn’t provided.
- Fix some missing cases of broadcasting in `jax.numpy.einsum` #2512.
- Implement `jax.numpy.cumsum` and `jax.numpy.cumprod` in terms of a parallel prefix scan (#2596) and make `reduce_prod` differentiable to arbitrary order (#2597).
- Add `batch_group_count` to `conv_general_dilated` #2635.
- Add docstring for `test_util.check_grads` #2656.
- Add `callback_transform` #2665.
- Implement `rollaxis`, `convolve`/`correlate` 1d & 2d, `copysign`, `trunc`, `roots`, and `quantile`/`percentile` interpolation options.
jaxlib 0.1.43 (March 31, 2020)#
- Fixed a performance regression for Resnet-50 on GPU.
jax 0.1.62 (March 21, 2020)#
- JAX has dropped support for Python 3.5. Please upgrade to Python 3.6 or newer.
- Removed the internal function `lax._safe_mul`, which implemented the convention `0. * nan == 0.`. This change means some programs when differentiated will produce nans when they previously produced correct values, though it ensures nans rather than silently incorrect results are produced for other programs. See #2447 and #1052 for details.
- Added an `all_gather` parallel convenience function.
- More type annotations in core code.
jaxlib 0.1.42 (March 19, 2020)#
- jaxlib 0.1.41 broke cloud TPU support due to an API incompatibility. This release fixes it again.
- JAX has dropped support for Python 3.5. Please upgrade to Python 3.6 or newer.
jax 0.1.61 (March 17, 2020)#
Fixes Python 3.5 support. This will be the last JAX or jaxlib release that supports Python 3.5.
jax 0.1.60 (March 17, 2020)#
New features:
- `jax.pmap` has a `static_broadcasted_argnums` argument which allows the user to specify arguments that should be treated as compile-time constants and should be broadcast to all devices. It works analogously to `static_argnums` in `jax.jit`.
- Improved error messages for when tracers are mistakenly saved in global state.
- Added `jax.nn.one_hot` utility function.
- Added `jax.experimental.jet` for exponentially faster higher-order automatic differentiation.
- Added more correctness checking to arguments of `jax.lax.broadcast_in_dim`.
- The minimum jaxlib version is now 0.1.41.
jaxlib 0.1.40 (March 4, 2020)#
- Adds experimental support in Jaxlib for the TensorFlow profiler, which allows tracing of CPU and GPU computations from TensorBoard.
- Includes prototype support for multihost GPU computations that communicate via NCCL.
- Improves performance of NCCL collectives on GPU.
- Adds TopK, CustomCallWithoutLayout, CustomCallWithLayout, IGammaGradA and RandomGamma implementations.
- Supports device assignments known at XLA compilation time.
jax 0.1.59 (February 11, 2020)#
Breaking changes
- The minimum jaxlib version is now 0.1.38.
- Simplified `Jaxpr` by removing the `Jaxpr.freevars` and `Jaxpr.bound_subjaxprs` fields. The call primitives (`xla_call`, `xla_pmap`, `sharded_call`, and `remat_call`) get a new parameter `call_jaxpr` with a fully-closed (no `constvars`) jaxpr. Also added a new field `call_primitive` to primitives.
New features:
- Reverse-mode automatic differentiation (e.g. `grad`) of `lax.cond`, making it now differentiable in both modes (#2091).
- JAX now supports DLPack, which allows sharing CPU and GPU arrays in a zero-copy way with other libraries, such as PyTorch.
- JAX GPU DeviceArrays now support `__cuda_array_interface__`, which is another zero-copy protocol for sharing GPU arrays with other libraries such as CuPy and Numba.
- JAX CPU device buffers now implement the Python buffer protocol, which allows zero-copy buffer sharing between JAX and NumPy.
- Added `JAX_SKIP_SLOW_TESTS` environment variable to skip tests known as slow.
jaxlib 0.1.39 (February 11, 2020)#
- Updates XLA.
jaxlib 0.1.38 (January 29, 2020)#
- CUDA 9.0 is no longer supported.
- CUDA 10.2 wheels are now built by default.
jax 0.1.58 (January 28, 2020)#
Breaking changes
- JAX has dropped Python 2 support, because Python 2 reached its end of life on January 1, 2020. Please update to Python 3.5 or newer.
New features
- Forward-mode automatic differentiation (`jvp`) of while loop (#1980)
- New NumPy and SciPy functions.
- Batched Cholesky decomposition on GPU now uses a more efficient batched kernel.
Notable bug fixes#
- With the Python 3 upgrade, JAX no longer depends on `fastcache`, which should help with installation.