dask 2023.8.0+dfsg-2 source package in Ubuntu

Changelog

dask (2023.8.0+dfsg-2) unstable; urgency=medium

  * Add force-little-endian-random.patch to try to initialize the random
    number generator on s390x the same way as on x86_64 (Closes: #1050526)
  * Depend on dask-sphinx-theme 3.0.5-2 to avoid accidentally including
    googletagmanager in the documentation.

 -- Diane Trout <email address hidden>  Fri, 25 Aug 2023 21:22:33 -0700
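
The patch mentioned above addresses an endianness difference: s390x is big-endian, so seed bytes read with the native byte order produce a different random stream than on little-endian x86_64. The following is a minimal sketch of that problem and of the force-little-endian idea, not the actual patch; the seed bytes and dtype strings are illustrative only.

  import numpy as np

  raw = b"\x01\x02\x03\x04"  # illustrative seed bytes

  # Native byte order differs between x86_64 (little-endian) and s390x
  # (big-endian), so this seed -- and any stream derived from it -- differs.
  native_seed = int(np.frombuffer(raw, dtype=np.uint32)[0])

  # Forcing an explicit little-endian dtype yields the same seed everywhere.
  le_seed = int(np.frombuffer(raw, dtype="<u4")[0])

  rng = np.random.default_rng(le_seed)
  print(rng.integers(0, 100, size=3))  # reproducible across architectures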

Upload details

Uploaded by:
Debian Python Team
Uploaded to:
Sid
Original maintainer:
Debian Python Team
Architectures:
all
Section:
misc
Urgency:
Medium Urgency

Publishing

Series Pocket Component Section
Noble release universe misc
Mantic release universe misc

Builds

Mantic: [FULLYBUILT] amd64

Downloads

File Size SHA-256 Checksum
dask_2023.8.0+dfsg-2.dsc 3.1 KiB c8bdbf6ae0877b94cf9d8de847dcd02aad96a0dee36080ded4921024a747135b
dask_2023.8.0+dfsg.orig.tar.xz 7.4 MiB d3051ddea3ea189f125227b8b302883b11ef32196c2f3d1ac36446d63be723fa
dask_2023.8.0+dfsg-2.debian.tar.xz 45.9 KiB 728208955351b365f963dd7f38e82ee93ca76effc05b49ff2706aaee484b6b54

No changes file available.

Binary packages built by this source

python-dask-doc: Minimal task scheduling abstraction documentation

 Dask is a flexible parallel computing library for analytics,
 containing two components.
 .
 1. Dynamic task scheduling optimized for computation. This is similar
 to Airflow, Luigi, Celery, or Make, but optimized for interactive
 computational workloads.
 2. "Big Data" collections like parallel arrays, dataframes, and lists
 that extend common interfaces like NumPy, Pandas, or Python iterators
 to larger-than-memory or distributed environments. These parallel
 collections run on top of the dynamic task schedulers.
 .
 This contains the documentation.

python3-dask: Minimal task scheduling abstraction for Python 3

 Dask is a flexible parallel computing library for analytics,
 containing two components.
 .
 1. Dynamic task scheduling optimized for computation. This is similar
 to Airflow, Luigi, Celery, or Make, but optimized for interactive
 computational workloads.
 2. "Big Data" collections like parallel arrays, dataframes, and lists
 that extend common interfaces like NumPy, Pandas, or Python iterators
 to larger-than-memory or distributed environments. These parallel
 collections run on top of the dynamic task schedulers.
 .
 This contains the Python 3 version.
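
A short sketch illustrating the two components described above, assuming python3-dask is installed together with NumPy (import name: dask); the function names and array sizes are arbitrary examples.

  import dask
  import dask.array as da

  # 1. Dynamic task scheduling: build a small task graph lazily, then compute.
  @dask.delayed
  def inc(x):
      return x + 1

  @dask.delayed
  def add(x, y):
      return x + y

  total = add(inc(1), inc(2))   # nothing has executed yet
  print(total.compute())        # 5 -- the scheduler runs the graph

  # 2. "Big Data" collections: a chunked array with a NumPy-like interface.
  x = da.ones((10_000, 10_000), chunks=(1_000, 1_000))
  print(x.mean().compute())     # 1.0, computed block-wise in parallel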