44 Commits

SHA1 Message Date
7f11c303fb chore: prepare v2.0.2 2024-06-22 19:33:56 -05:00
2e0be4ac62 fix: loop bug 2024-06-22 19:33:45 -05:00
0ba315ede0 chore: prepare v2.0.1 2024-06-22 19:12:56 -05:00
08bd9a817a fix: asyncpipe is profunctor 2024-06-22 19:12:31 -05:00
970d890a00 chore: prepare v2.0.0 2024-06-22 18:42:49 -05:00
5b3eda707e feat!: add AsyncPipe abstraction, significantly improve throughput of Transform streams 2024-06-22 18:42:22 -05:00
4cd44367a8 chore: prepare v1.6.1 2024-06-21 13:21:22 -05:00
d76f55e267 fix: introduced transform bug 2024-06-21 13:21:19 -05:00
4b91ab7d5c chore: prepare v1.6.0 2024-06-20 15:40:32 -05:00
a8702f4849 fix: finish may not emit until all chunks are read 2024-06-20 15:40:17 -05:00
f3ea830379 chore: prepare v1.5.0 2024-05-15 13:30:23 -05:00
4242330bef fix: write readme, clarify API 2024-05-15 13:30:08 -05:00
1b6e6423b1 chore: prepare v1.4.1 2024-05-14 15:02:39 -05:00
657af14bb6 fix: await finished before exiting 2024-05-14 14:07:20 -05:00
33d42034fc fix: continue to read after readableEnded 2024-05-14 13:51:57 -05:00
c8822aeffe fix: I WAS INCORRECT HHEHE 2024-05-14 13:44:33 -05:00
7076b13df4 fix: catch incorrect stream implementations 2024-05-14 13:42:39 -05:00
3f4bc12d36 chore: prepare v1.4.0 2024-05-14 13:12:56 -05:00
fd53b6520f feat: Collect.toBuffer 2024-05-14 13:12:41 -05:00
0ef7240d61 wip: explore removing delays(10) 2024-05-14 12:44:31 -05:00
de22f44f86 wip: explore removing delays(9) 2024-05-14 12:43:38 -05:00
f9c0e20777 wip: explore removing delays(8) 2024-05-14 11:08:46 -05:00
edc7d40dbc wip: explore removing delays(7) 2024-05-14 11:04:50 -05:00
ba8d90038d wip: explore removing delays(6) 2024-05-14 10:55:14 -05:00
dfdca9f5e9 wip: explore removing delays(5) 2024-05-14 10:39:06 -05:00
67ae171532 wip: explore removing delays(4) 2024-05-13 21:18:27 -05:00
a347c05062 wip: explore removing delays(3) 2024-05-13 21:17:28 -05:00
f9446c97a0 wip: explore removing delays(2) 2024-05-13 21:15:45 -05:00
d3b8d1792d wip: explore removing delays 2024-05-13 21:06:41 -05:00
e05c74f42f fix: minor fixes 2024-05-13 15:04:34 -05:00
e1c2481e70 chore: prepare v1.3.3 2024-05-13 14:42:48 -05:00
820351f800 fix: more yields 2024-05-13 14:42:23 -05:00
9d8b500b8d chore: prepare v1.3.2 2024-05-13 14:35:55 -05:00
b7bead090e fix: transform should read more than just 1 chunk after writing 2024-05-13 14:35:44 -05:00
3db5cc44a9 chore: prepare v1.3.1 2024-05-13 13:27:28 -05:00
1a5ca66e83 fix: Pipes.Node.FS.read' 2024-05-13 13:27:18 -05:00
54d9d57927 chore: prepare v1.3.0 2024-05-13 11:21:23 -05:00
a5c535fb1e feat: Pipes.Construct 2024-05-13 11:21:06 -05:00
7e6c6af3dd chore: prepare v1.2.3 2024-05-11 22:11:44 -05:00
faf49fafd5 chore: lock 2024-05-11 22:11:40 -05:00
04815f66a4 chore: prepare v1.2.2 2024-05-11 22:10:57 -05:00
fd895de148 fix: ensure-ranges 2024-05-11 22:09:45 -05:00
b618ef1819 chore: prepare v1.2.1 2024-05-11 22:08:38 -05:00
407491f055 fix: generalize all Affs to MonadAffs 2024-05-11 22:08:27 -05:00
22 changed files with 9261 additions and 288 deletions

README.md

@@ -2,41 +2,114 @@
Interact with node streams in object mode using [`Pipes`]!
## Example
```purescript
import Prelude
import Effect.Aff (launchAff_)
import Effect (Effect)
import Effect.Class (liftEffect)
import Effect.Console (log)
import Pipes ((>->))
import Pipes.Prelude as Pipes
import Pipes.Core as Pipes.Core
import Pipes.Node.FS.Stream as FS
import Pipes.Node.Zlib as Zlib
import Pipes.CSV.Parse as CSV.Parse
-- == my-zipped-data.csv ==
-- id,foo,is_deleted
-- 1,hello,f
-- 2,goodbye,t
-- Logs:
-- {id: 1, foo: "hello", is_deleted: false}
-- {id: 2, foo: "goodbye", is_deleted: true}
main :: Effect Unit
main =
  launchAff_
    $ Pipes.Core.runEffect
    $ FS.createReadStream "my-zipped-data.csv.gz"
    >-> Zlib.gunzip
    >-> CSV.Parse.parse @{id :: Int, foo :: String, is_deleted :: Boolean}
    >-> Pipes.mapM (liftEffect <<< log)
```
## Installing
## Install
```bash
spago install node-stream-pipes
```
## Usage
### Node Streams
#### Raw Streams
Raw `objectMode` Node streams are represented in `Node.Stream.Object`:
- `Writable a` accepts chunks of type `a`
- `Readable a` emits chunks of type `a`
- `Transform a b` transforms chunks from `a` to `b`
Non-Object streams can also be represented with these types; for example an `fs.WriteStream`
can be coerced to `Writable Buffer` without issue.
Interop between these types and `Node.Stream` is provided in `Node.Stream.Object` (a short sketch follows the list):
- `unsafeFrom{String,Buffer}{Writable,Readable,Transform}`
- `unsafeCoerce{Writable,Readable,Transform}`
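For example, a raw file write stream (which accepts `Buffer` chunks) can be coerced into the object-mode wrapper type. A minimal sketch, not taken from the package (the output path is arbitrary):
```purescript
import Prelude

import Effect (Effect)
import Node.Buffer (Buffer)
import Node.FS.Stream as FS.Stream
import Node.Stream.Object as ObjectStream

-- Coerce a raw fs.WriteStream into Node.Stream.Object's Writable wrapper.
fileSink :: Effect (ObjectStream.Writable Buffer)
fileSink = ObjectStream.unsafeCoerceWritable <$> FS.Stream.createWriteStream "out.bin"
```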
#### Pipes
Streams in `Node.Stream.Object` can be converted to `Producer`s, `Consumer`s and `AsyncPipe`s with `Pipes.Node.Stream` (a short sketch follows the list):
- `fromReadable :: forall a. <Readable a> -> Producer (Maybe a) <Aff> Unit`
- `fromWritable :: forall a. <Writable a> -> Consumer (Maybe a) <Aff> Unit`
- `fromTransform :: forall a b. Effect <Transform a b> -> AsyncPipe <TransformContext a b> <Aff> (Maybe a) (Maybe b)`
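As a minimal sketch (this is essentially how `Pipes.Node.FS.read` is implemented), a raw file read stream becomes a `Producer` of `Buffer` chunks:
```purescript
import Prelude

import Data.Maybe (Maybe)
import Effect.Aff (Aff)
import Effect.Class (liftEffect)
import Node.Buffer (Buffer)
import Node.FS.Stream as FS.Stream
import Node.Stream.Object as ObjectStream
import Pipes.Core (Producer)
import Pipes.Node.Stream (fromReadable)

-- Wrap an fs.ReadStream as a Producer that emits Nothing before exiting.
fileChunks :: String -> Producer (Maybe Buffer) Aff Unit
fileChunks path = do
  raw <- liftEffect $ FS.Stream.createReadStream path
  fromReadable (ObjectStream.unsafeCoerceReadable raw)
```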
#### EOS Marker
Normally, pipe computations will not be executed once any computation in a pipeline exits.
To allow for resource cleanup and awareness that the stream is about to close,
`Maybe a` is used throughout this package as an End-of-Stream (EOS) marker:
```purescript
-- foo.txt is "hello, world!\n"
chunks <- Pipes.Collect.toArray $ Pipes.Node.FS.read "foo.txt" >-> Pipes.Node.Stream.inEOS (Pipes.Node.Buffer.toString UTF8)
chunks `shouldEqual` [Just "hello, world!\n", Nothing]
```
Pipes from `a -> b` unaware of EOS can be lifted to `Maybe a -> Maybe b` with `Pipes.Node.Stream.inEOS`.
Producers of `Maybe a` can drop the EOS marker and emit `a` with `Pipes.Node.Stream.unEOS`.
Producers of `a` can have an EOS marker added with `Pipes.Node.Stream.withEOS`.
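For example, an EOS-unaware pipe like `Pipes.Prelude.map` can be lifted with `inEOS` and the marker stripped again with `unEOS`. A minimal sketch, assuming `Pipes.Prelude as Pipes` and `Data.String as String` are imported qualified:
```purescript
-- the mapping pipe never sees the EOS marker; unEOS drops it before collection
chunks <- Pipes.Collect.toArray
  $ Pipes.Node.Stream.withEOS (Pipes.Construct.eachArray ["a", "b"])
  >-> Pipes.Node.Stream.inEOS (Pipes.map String.toUpper)
  >-> Pipes.Node.Stream.unEOS
chunks `shouldEqual` ["A", "B"]
```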
#### Example
`Pipes.PassThrough.js`
```javascript
import {PassThrough} from 'stream'
export const makePassThrough = () => new PassThrough()
```
`Pipes.PassThrough.purs`
```purescript
module Pipes.PassThrough where

import Data.Maybe (Maybe)
import Effect (Effect)
import Effect.Aff (Aff)
import Node.Stream.Object as ObjectStream
import Pipes.Async (AsyncPipe)
import Pipes.Node.Stream (TransformContext, fromTransform)

type PassThroughStream a = ObjectStream.Transform a a

foreign import makePassThrough :: forall a. Effect (PassThroughStream a)

passThrough :: forall a. AsyncPipe (TransformContext a a) Aff (Maybe a) (Maybe a)
passThrough = fromTransform makePassThrough
```
### Utilities
This package provides utilities that explicitly use `MonadRec` to ensure stack-safety
when dealing with producers of large amounts of data.
- `Pipes.Collect` provides stack-safe utilities for executing a pipeline and collecting results into a collection, `Buffer`, `Monoid`, etc. (see the sketch after this list)
- `Pipes.Construct` provides stack-safe utilities for creating producers from in-memory collections.
- `Pipes.Util` provides some miscellaneous utilities missing from `pipes`.
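For example, `Pipes.Construct.range` and `Pipes.Collect.fold` can fold a large stream of integers without growing the stack. A minimal sketch (the bounds are arbitrary):
```purescript
-- stack-safe strict left fold over 100,000 produced integers
total <- Pipes.Collect.fold (+) 0 (Pipes.Construct.range 0 100000)
```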
### Zlib
Pipes for compression & decompression using `zlib` are provided in `Pipes.Node.Zlib`.
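For example, a file can be streamed through gzip and back to disk with the `>-/->` operator from `Pipes.Async`. A minimal sketch, not taken from the package docs (file names are arbitrary):
```purescript
import Prelude

import Effect (Effect)
import Effect.Aff (launchAff_)
import Pipes ((>->))
import Pipes.Async ((>-/->))
import Pipes.Core as Pipes.Core
import Pipes.Node.FS as Pipes.Node.FS
import Pipes.Node.Zlib as Pipes.Node.Zlib

-- Stream foo.txt through gzip into foo.txt.gz; create fails if the target exists.
main :: Effect Unit
main =
  launchAff_
    $ Pipes.Core.runEffect
    $ Pipes.Node.FS.read "foo.txt"
    >-/-> Pipes.Node.Zlib.gzip
    >-> Pipes.Node.FS.create "foo.txt.gz"
```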
### FS
Read files with:
- `Pipes.Node.FS.read <path>`
- `Pipes.Node.FS.read' <ReadStreamOptions> <path>`
```purescript
Pipes.Collect.toStringWith UTF8 $ Pipes.Node.FS.read "foo.txt" >-> Pipes.Node.Stream.unEOS
```
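`read'` forwards its options to `fs.createReadStream`. A minimal sketch, assuming `start` and `end` are members of `ReadStreamOptions`:
```purescript
-- collect only the first 16 bytes (assumes ReadStreamOptions includes start & end)
firstBytes <- Pipes.Collect.toBuffer $ Pipes.Node.FS.read' { start: 0, end: 15 } "foo.txt" >-> Pipes.Node.Stream.unEOS
```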
Write files with:
- `Pipes.Node.FS.write' <WriteStreamOptions> <path>`
- `Pipes.Node.FS.trunc <path>`
- `Pipes.Node.FS.create <path>`
- `Pipes.Node.FS.append <path>`
```purescript
Pipes.Node.Stream.withEOS (Pipes.Construct.eachArray ["id,name", "1,henry", "2,suzie"])
  >-> Pipes.Util.intersperse "\n"
  >-> Pipes.Node.Stream.inEOS (Pipes.Node.Buffer.fromString UTF8)
  >-> Pipes.Node.FS.create "foo.csv"
```
[`Pipes`]: https://pursuit.purescript.org/packages/purescript-pipes/8.0.0

bun.lockb (binary file not shown)


@@ -1,12 +1,13 @@
{
"name": "purescript-node-stream-pipes",
"version": "v1.2.0",
"version": "v2.0.2",
"type": "module",
"dependencies": {
"csv-parse": "^5.5.5",
"csv-stringify": "^6.4.6"
"csv-parse": "^5.5.6",
"csv-stringify": "^6.5.0"
},
"devDependencies": {
"cbor-x": "^1.5.9",
"typescript": "^5.4.5"
}
}


@@ -5,12 +5,16 @@ workspace:
dependencies:
- aff: ">=7.1.0 <8.0.0"
- arrays: ">=7.3.0 <8.0.0"
- console: ">=6.1.0 <7.0.0"
- control: ">=6.0.0 <7.0.0"
- datetime: ">=6.1.0 <7.0.0"
- effect: ">=4.0.0 <5.0.0"
- either: ">=6.1.0 <7.0.0"
- exceptions: ">=6.0.0 <7.0.0"
- foldable-traversable: ">=6.0.0 <7.0.0"
- foreign-object
- lists
- foreign-object: ">=4.1.0 <5.0.0"
- fork: ">=6.0.0 <7.0.0"
- lists: ">=7.0.0 <8.0.0"
- maybe: ">=6.0.0 <7.0.0"
- mmorph: ">=7.0.0 <8.0.0"
- newtype: ">=5.0.0 <6.0.0"
@@ -20,16 +24,18 @@ workspace:
- node-path: ">=5.0.0 <6.0.0"
- node-streams: ">=9.0.0 <10.0.0"
- node-zlib: ">=0.4.0 <0.5.0"
- ordered-collections
- now: ">=6.0.0 <7.0.0"
- ordered-collections: ">=3.2.0 <4.0.0"
- parallel: ">=6.0.0 <7.0.0"
- pipes: ">=8.0.0 <9.0.0"
- prelude: ">=6.0.1 <7.0.0"
- profunctor: ">=6.0.1 <7.0.0"
- st: ">=6.2.0 <7.0.0"
- strings: ">=6.0.1 <7.0.0"
- tailrec: ">=6.1.0 <7.0.0"
- transformers: ">=6.0.0 <7.0.0"
- tuples
- unordered-collections
- tuples: ">=7.0.0 <8.0.0"
- unordered-collections: ">=3.1.0 <4.0.0"
- unsafe-coerce: ">=6.0.0 <7.0.0"
test_dependencies:
- console


@@ -1,7 +1,7 @@
package:
name: node-stream-pipes
publish:
version: '1.2.0'
version: '2.0.2'
license: 'GPL-3.0-or-later'
location:
githubOwner: 'cakekindel'
@@ -10,17 +10,18 @@ package:
strict: true
pedanticPackages: true
dependencies:
- foreign-object
- lists
- ordered-collections
- tuples
- unordered-collections
- aff: ">=7.1.0 <8.0.0"
- arrays: ">=7.3.0 <8.0.0"
- console: ">=6.1.0 <7.0.0"
- control: ">=6.0.0 <7.0.0"
- datetime: ">=6.1.0 <7.0.0"
- effect: ">=4.0.0 <5.0.0"
- either: ">=6.1.0 <7.0.0"
- exceptions: ">=6.0.0 <7.0.0"
- foldable-traversable: ">=6.0.0 <7.0.0"
- foreign-object: ">=4.1.0 <5.0.0"
- fork: ">=6.0.0 <7.0.0"
- lists: ">=7.0.0 <8.0.0"
- maybe: ">=6.0.0 <7.0.0"
- mmorph: ">=7.0.0 <8.0.0"
- newtype: ">=5.0.0 <6.0.0"
@@ -30,16 +31,23 @@ package:
- node-path: ">=5.0.0 <6.0.0"
- node-streams: ">=9.0.0 <10.0.0"
- node-zlib: ">=0.4.0 <0.5.0"
- now: ">=6.0.0 <7.0.0"
- ordered-collections: ">=3.2.0 <4.0.0"
- parallel: ">=6.0.0 <7.0.0"
- pipes: ">=8.0.0 <9.0.0"
- prelude: ">=6.0.1 <7.0.0"
- profunctor: ">=6.0.1 <7.0.0"
- st: ">=6.2.0 <7.0.0"
- strings: ">=6.0.1 <7.0.0"
- tailrec: ">=6.1.0 <7.0.0"
- transformers: ">=6.0.0 <7.0.0"
- tuples: ">=7.0.0 <8.0.0"
- unordered-collections: ">=3.1.0 <4.0.0"
- unsafe-coerce: ">=6.0.0 <7.0.0"
test:
main: Test.Main
build:
strict: true
dependencies:
- console
- gen


@@ -3,31 +3,36 @@ import Stream from "stream";
/** @type {(s: Stream.Readable | Stream.Transform) => () => boolean} */
export const isReadableImpl = (s) => () => s.readable;
/** @type {(s: Stream.Readable | Stream.Transform) => () => number} */
export const readableLengthImpl = (s) => () => s.readableLength;
/** @type {(s: Stream.Writable | Stream.Readable) => () => boolean} */
export const isClosedImpl = (s) => () => s.closed;
/** @type {(s: Stream.Writable | Stream.Transform) => () => boolean} */
export const isWritableImpl = (s) => () => s.writable;
/** @type {(s: Stream.Writable | Stream.Transform) => () => boolean} */
export const needsDrainImpl = (s) => () => s.writableNeedDrain;
/** @type {(s: Stream.Readable | Stream.Transform) => () => boolean} */
export const isReadableEndedImpl = (s) => () => s.readableEnded;
/** @type {(s: Stream.Writable | Stream.Transform) => () => boolean} */
export const isWritableEndedImpl = (s) => () => s.writableEnded;
/** @type {(s: Stream.Writable | Stream.Transform) => () => boolean} */
export const isWritableFinishedImpl = (s) => () => s.writableFinished;
/** @type {(s: Stream.Writable | Stream.Transform) => () => void} */
export const endImpl = (s) => () => s.end();
/** @type {<WriteResult>(o: {ok: WriteResult, wouldBlock: WriteResult, closed: WriteResult}) => (s: Stream.Writable | Stream.Transform) => (a: unknown) => () => WriteResult} */
/** @type {<WriteResult>(o: {ok: WriteResult, wouldBlock: WriteResult}) => (s: Stream.Writable | Stream.Transform) => (a: unknown) => () => WriteResult} */
export const writeImpl =
({ ok, wouldBlock, closed }) =>
({ ok, wouldBlock }) =>
(s) =>
(a) =>
() => {
if (s.closed || s.writableEnded) {
return closed;
}
if (s.write(a)) {
return ok;
} else {
@@ -35,15 +40,11 @@ export const writeImpl =
}
};
/** @type {<ReadResult>(o: {just: (_a: unknown) => ReadResult, wouldBlock: ReadResult, closed: ReadResult}) => (s: Stream.Readable | Stream.Transform) => () => ReadResult} */
/** @type {<ReadResult>(o: {just: (_a: unknown) => ReadResult, wouldBlock: ReadResult}) => (s: Stream.Readable | Stream.Transform) => () => ReadResult} */
export const readImpl =
({ just, closed, wouldBlock }) =>
({ just, wouldBlock }) =>
(s) =>
() => {
if (s.closed || s.readableEnded) {
return closed;
}
const a = s.read();
if (a === null) {
return wouldBlock;


@@ -14,8 +14,9 @@ import Data.Maybe (Maybe(..))
import Data.Show.Generic (genericShow)
import Effect (Effect)
import Effect.Aff (Aff, effectCanceler, makeAff)
import Effect.Aff as Aff
import Effect.Class (liftEffect)
import Effect.Exception (Error)
import Effect.Exception (Error, error)
import Effect.Uncurried (mkEffectFn1)
import Node.Buffer (Buffer)
import Node.EventEmitter (EventHandle(..))
@@ -26,7 +27,6 @@ import Unsafe.Coerce (unsafeCoerce)
data ReadResult a
= ReadWouldBlock
| ReadClosed
| ReadJust a
derive instance Generic (ReadResult a) _
@@ -35,9 +35,12 @@ derive instance Eq a => Eq (ReadResult a)
instance Show (ReadResult a) where
show = genericShow <<< map (const "..")
maybeReadResult :: forall a. ReadResult a -> Maybe a
maybeReadResult (ReadWouldBlock) = Nothing
maybeReadResult (ReadJust a) = Just a
data WriteResult
= WriteWouldBlock
| WriteClosed
| WriteOk
derive instance Generic WriteResult _
@@ -45,8 +48,8 @@ derive instance Eq WriteResult
instance Show WriteResult where
show = genericShow
type ReadResultFFI a = { closed :: ReadResult a, wouldBlock :: ReadResult a, just :: a -> ReadResult a }
type WriteResultFFI = { closed :: WriteResult, wouldBlock :: WriteResult, ok :: WriteResult }
type ReadResultFFI a = { wouldBlock :: ReadResult a, just :: a -> ReadResult a }
type WriteResultFFI = { wouldBlock :: WriteResult, ok :: WriteResult }
foreign import data Writable :: Type -> Type
foreign import data Readable :: Type -> Type
@@ -59,13 +62,16 @@ foreign import isReadableImpl :: forall s. s -> Effect Boolean
foreign import isWritableImpl :: forall s. s -> Effect Boolean
foreign import isReadableEndedImpl :: forall s. s -> Effect Boolean
foreign import isWritableEndedImpl :: forall s. s -> Effect Boolean
foreign import isWritableFinishedImpl :: forall s. s -> Effect Boolean
foreign import isClosedImpl :: forall s. s -> Effect Boolean
foreign import needsDrainImpl :: forall s. s -> Effect Boolean
foreign import readableLengthImpl :: forall s. s -> Effect Int
readResultFFI :: forall a. ReadResultFFI a
readResultFFI = { closed: ReadClosed, wouldBlock: ReadWouldBlock, just: ReadJust }
readResultFFI = { wouldBlock: ReadWouldBlock, just: ReadJust }
writeResultFFI :: WriteResultFFI
writeResultFFI = { closed: WriteClosed, wouldBlock: WriteWouldBlock, ok: WriteOk }
writeResultFFI = { wouldBlock: WriteWouldBlock, ok: WriteOk }
class Stream :: Type -> Constraint
class Stream s where
@@ -81,25 +87,31 @@ else instance Stream s => Stream s where
isClosed s = isClosed s
class Stream s <= Read s a | s -> a where
readableLength :: s -> Effect Int
isReadable :: s -> Effect Boolean
isReadableEnded :: s -> Effect Boolean
read :: s -> Effect (ReadResult a)
class Stream s <= Write s a | s -> a where
isWritable :: s -> Effect Boolean
needsDrain :: s -> Effect Boolean
isWritableEnded :: s -> Effect Boolean
isWritableFinished :: s -> Effect Boolean
write :: s -> a -> Effect WriteResult
end :: s -> Effect Unit
instance Read (Readable a) a where
readableLength = readableLengthImpl
isReadable = isReadableImpl
isReadableEnded = isReadableEndedImpl
read = readImpl readResultFFI
else instance Read (Transform a b) b where
readableLength = readableLengthImpl
isReadable = isReadableImpl
isReadableEnded = isReadableEndedImpl
read = readImpl readResultFFI
else instance (Read s a) => Read s a where
readableLength = readableLengthImpl
isReadable = isReadableImpl
isReadableEnded = isReadableEndedImpl
read s = read s
@@ -107,18 +119,24 @@ else instance (Read s a) => Read s a where
instance Write (Writable a) a where
isWritable = isWritableImpl
isWritableEnded = isWritableEndedImpl
isWritableFinished = isWritableFinishedImpl
write s = writeImpl writeResultFFI s
end = endImpl
needsDrain = needsDrainImpl
else instance Write (Transform a b) a where
isWritable = isWritableImpl
isWritableEnded = isWritableEndedImpl
isWritableFinished = isWritableFinishedImpl
write s = writeImpl writeResultFFI s
end = endImpl
needsDrain = needsDrainImpl
else instance (Write s a) => Write s a where
isWritable = isWritableImpl
isWritableEnded = isWritableEndedImpl
isWritableFinished = isWritableFinishedImpl
write s a = write s a
end s = end s
needsDrain = needsDrainImpl
withErrorST :: forall s. Stream s => s -> Effect { cancel :: Effect Unit, error :: STRef Global (Maybe Error) }
withErrorST s = do
@@ -126,39 +144,74 @@ withErrorST s = do
cancel <- flip (Event.once errorH) s \e -> void $ liftST $ STRef.write (Just e) error
pure { error, cancel }
fromBufferReadable :: forall r. Stream.Readable r -> Readable Buffer
fromBufferReadable = unsafeCoerce
unsafeCoerceWritable :: forall r a. Stream.Writable r -> Writable a
unsafeCoerceWritable = unsafeCoerce
fromBufferTransform :: Stream.Duplex -> Transform Buffer Buffer
fromBufferTransform = unsafeCoerce
unsafeCoerceReadable :: forall r a. Stream.Readable r -> Readable a
unsafeCoerceReadable = unsafeCoerce
fromBufferWritable :: forall r. Stream.Writable r -> Writable Buffer
fromBufferWritable = unsafeCoerce
unsafeCoerceTransform :: forall a b. Stream.Duplex -> Transform a b
unsafeCoerceTransform = unsafeCoerce
fromStringReadable :: forall r. Stream.Readable r -> Readable String
fromStringReadable = unsafeCoerce
unsafeFromBufferReadable :: forall r. Stream.Readable r -> Readable Buffer
unsafeFromBufferReadable = unsafeCoerce
fromStringTransform :: Stream.Duplex -> Transform String String
fromStringTransform = unsafeCoerce
unsafeFromBufferTransform :: forall a. Stream.Duplex -> Transform Buffer a
unsafeFromBufferTransform = unsafeCoerce
fromStringWritable :: forall r. Stream.Writable r -> Writable String
fromStringWritable = unsafeCoerce
unsafeFromBufferWritable :: forall r. Stream.Writable r -> Writable Buffer
unsafeFromBufferWritable = unsafeCoerce
unsafeFromStringReadable :: forall r. Stream.Readable r -> Readable String
unsafeFromStringReadable = unsafeCoerce
unsafeFromStringTransform :: forall a. Stream.Duplex -> Transform String a
unsafeFromStringTransform = unsafeCoerce
unsafeFromStringWritable :: forall r. Stream.Writable r -> Writable String
unsafeFromStringWritable = unsafeCoerce
awaitReadableOrClosed :: forall s a. Read s a => s -> Aff Unit
awaitReadableOrClosed s = do
awaitReadableOrClosed s = Aff.supervise do
fiber <-
Aff.forkAff
$ parOneOf
[ onceAff0 readableH s $> Right unit
, onceAff0 closeH s $> Right unit
, Left <$> onceAff1 errorH s
]
closed <- liftEffect $ isClosed s
ended <- liftEffect $ isReadableEnded s
readEnded <- liftEffect $ isReadableEnded s
readable <- liftEffect $ isReadable s
when (not ended && not closed && not readable)
$ liftEither =<< parOneOf [ onceAff0 readableH s $> Right unit, onceAff0 closeH s $> Right unit, Left <$> onceAff1 errorH s ]
length <- liftEffect $ readableLength s
if (not closed && not readEnded && readable && length == 0) then
liftEither =<< Aff.joinFiber fiber
else
Aff.killFiber (error "") fiber
awaitFinished :: forall s a. Write s a => s -> Aff Unit
awaitFinished s = Aff.supervise do
fiber <- Aff.forkAff $ onceAff0 finishH s
finished <- liftEffect $ isWritableFinished s
if not finished then Aff.joinFiber fiber else Aff.killFiber (error "") fiber
awaitWritableOrClosed :: forall s a. Write s a => s -> Aff Unit
awaitWritableOrClosed s = do
awaitWritableOrClosed s = Aff.supervise do
fiber <-
Aff.forkAff
$ parOneOf
[ onceAff0 drainH s $> Right unit
, onceAff0 closeH s $> Right unit
, Left <$> onceAff1 errorH s
]
closed <- liftEffect $ isClosed s
ended <- liftEffect $ isWritableEnded s
writeEnded <- liftEffect $ isWritableEnded s
writable <- liftEffect $ isWritable s
when (not ended && not closed && not writable)
$ liftEither =<< parOneOf [ onceAff0 drainH s $> Right unit, onceAff0 closeH s $> Right unit, Left <$> onceAff1 errorH s ]
needsDrain <- liftEffect $ needsDrain s
if not closed && not writeEnded && writable && needsDrain then
liftEither =<< Aff.joinFiber fiber
else
Aff.killFiber (error "") fiber
onceAff0 :: forall e. EventHandle0 e -> e -> Aff Unit
onceAff0 h emitter = makeAff \res -> do
@@ -184,3 +237,6 @@ errorH = EventHandle "error" mkEffectFn1
endH :: forall s a. Write s a => EventHandle0 s
endH = EventHandle "end" identity
finishH :: forall s a. Write s a => EventHandle0 s
finishH = EventHandle "finish" identity

src/Pipes.Async.purs (new file)

@@ -0,0 +1,295 @@
module Pipes.Async where
import Prelude hiding (join)
import Control.Alternative (class Alternative, empty, guard)
import Control.Monad.Error.Class (class MonadError, catchError, throwError)
import Control.Monad.Except (ExceptT, runExceptT)
import Control.Monad.Fork.Class (class MonadBracket, class MonadFork, fork)
import Control.Monad.Maybe.Trans (runMaybeT)
import Control.Monad.Morph (hoist)
import Control.Monad.Rec.Class (class MonadRec, Step(..), tailRecM)
import Control.Monad.ST.Class (liftST)
import Control.Monad.ST.Ref (STRef)
import Control.Monad.ST.Ref as ST.Ref
import Control.Monad.Trans.Class (lift)
import Control.Parallel (class Parallel, parOneOf)
import Data.Array (fold)
import Data.Array as Array
import Data.DateTime.Instant as Instant
import Data.Either (Either(..), either)
import Data.Generic.Rep (class Generic)
import Data.Maybe (Maybe(..), isNothing)
import Data.Newtype (unwrap)
import Data.Profunctor (class Profunctor)
import Data.Show.Generic (genericShow)
import Data.Time.Duration (Milliseconds)
import Data.Traversable (traverse_)
import Data.Tuple.Nested (type (/\), (/\))
import Effect.Aff.Class (class MonadAff)
import Effect.Class (liftEffect)
import Effect.Console (log)
import Effect.Exception (Error)
import Effect.Now as Now
import Pipes (await, yield)
import Pipes.Collect as Collect
import Pipes.Core (Pipe, Producer, Proxy)
data WriteSignal
= WriteSignalOk
| WriteSignalEnded
derive instance Generic WriteSignal _
derive instance Eq WriteSignal
derive instance Ord WriteSignal
instance Show WriteSignal where show = genericShow
instance Discard WriteSignal where
discard = bind
data ReadSignal
= ReadSignalOk
| ReadSignalEnded
derive instance Generic ReadSignal _
derive instance Eq ReadSignal
derive instance Ord ReadSignal
instance Show ReadSignal where show = genericShow
instance Discard ReadSignal where
discard = bind
data WriteResult
= WriteAgain
| WriteNeedsDrain
| WriteEnded
derive instance Generic WriteResult _
derive instance Eq WriteResult
derive instance Ord WriteResult
instance Show WriteResult where show = genericShow
data ReadResult a
= ReadOk a
| ReadWouldBlock
derive instance Generic (ReadResult a) _
derive instance Eq a => Eq (ReadResult a)
derive instance Ord a => Ord (ReadResult a)
derive instance Functor ReadResult
instance Show a => Show (ReadResult a) where show = genericShow
-- | An `AsyncPipe` is a `Pipe`-like struct that allows
-- | concurrently reading from a `Producer` and writing to a `Consumer`.
-- |
-- | An implementation of `AsyncPipe` for Node `Transform` streams
-- | is provided in `Pipes.Node.Stream`.
-- |
-- | ## Fields
-- | - `m x`
-- | - Initializer
-- | - `x -> a -> m WriteResult`
-- | - Write a value `a` to the underlying resource
-- | - `x -> m WriteSignal`
-- | - Block until the pipe is writable again (or writing must stop)
-- | - `x -> m (ReadResult b)`
-- | - Attempt to read a chunk
-- | - `x -> m ReadSignal`
-- | - Block until the pipe is readable again (or reading must stop)
data AsyncPipe x m a b =
AsyncPipe
(m x)
(x -> a -> m WriteResult)
(x -> m WriteSignal)
(x -> m (ReadResult b))
(x -> m ReadSignal)
instance Monad m => Functor (AsyncPipe x m a) where
map f (AsyncPipe init w aw r ar) = AsyncPipe init w aw (map (map f) <<< r) ar
instance Monad m => Profunctor (AsyncPipe x m) where
dimap :: forall a b c d. (a -> b) -> (c -> d) -> _ b c -> _ a d
dimap ab cd (AsyncPipe init w aw r ar) =
AsyncPipe
init
(\x -> w x <<< ab)
aw
(map (map cd) <<< r)
ar
-- | Wraps all fields of an `AsyncPipe` with logging to debug
-- | behavior and timing.
debug :: forall x a b m. MonadAff m => String -> AsyncPipe x m (Maybe a) (Maybe b) -> AsyncPipe x m (Maybe a) (Maybe b)
debug c (AsyncPipe init write awaitWrite read awaitRead) =
let
logL m = liftEffect $ log $ "[" <> c <> "] " <> m
logR m = liftEffect $ log $ "[" <> c <> "] " <> fold (Array.replicate 20 " ") <> m
time :: forall a'. m a' -> m (Milliseconds /\ a')
time m = do
start <- liftEffect Now.now
a <- m
end <- liftEffect Now.now
pure $ (end `Instant.diff` start) /\ a
init' = do
logL "init >"
elapsed /\ x <- time init
logL $ "< init " <> "(" <> show (unwrap elapsed) <> "ms)"
pure x
write' x a = do
logL "write >"
elapsed /\ w <- time $ write x a
logL $ "< write " <> show w <> " (" <> show (unwrap elapsed) <> "ms)"
pure w
read' x = do
logR "read >"
elapsed /\ r <- time $ read x
logR $ "< read " <> show (map (const unit) <$> r) <> " (" <> show (unwrap elapsed) <> "ms)"
pure r
awaitWrite' x = do
logL "awaitWrite >"
elapsed /\ w <- time $ awaitWrite x
logL $ "< awaitWrite " <> show w <> " (" <> show (unwrap elapsed) <> "ms)"
pure w
awaitRead' x = do
logR "awaitRead >"
elapsed /\ r <- time $ awaitRead x
logR $ "< awaitRead " <> show r <> " (" <> show (unwrap elapsed) <> "ms)"
pure r
in
AsyncPipe init' write' awaitWrite' read' awaitRead'
-- | Convert an `AsyncPipe` to a regular `Pipe`.
-- |
-- | Rather than two concurrently-running halves (producer & consumer),
-- | this requires the `AsyncPipe` to occasionally stop `await`ing data
-- | written by the upstream `Producer` so that it can `yield` to the downstream `Consumer`.
-- |
-- | This implementation chooses to prioritize `yield`ing data to the `Consumer` over
-- | `await`ing written chunks.
-- |
-- | Note that using this limits the potential parallelism of the entire pipeline, ex:
-- |
-- | ```purs
-- | Pipe.FS.read "foo.csv" -- read
-- | >-> sync Pipe.CSV.parse -- parse
-- | >-> sync Pipe.CBOR.encode -- encode
-- | >-> Pipe.FS.write "foo.bin" -- write
-- | ```
-- |
-- | In the above example, this is what happens when the pipeline
-- | is executed:
-- | 1. `write` asks `encode` "do you have any data yet?" (fast)
-- | 1. `encode` asks `parse` "do you have any data yet?" (fast)
-- | 1. `parse` asks `read` "do you have any data yet?" (fast)
-- | 1. `read` passes 1 chunk to `parse` (fast)
-- | 1. `parse` blocks until the chunk is parsed (slow)
-- | 1. `parse` passes 1 chunk to `encode` (fast)
-- | 1. `encode` blocks until the chunk is encoded (slow)
-- | 1. `write` writes the block (fast)
-- |
-- | For larger workloads, changing this to use `asyncPipe` would be preferable, ex:
-- | ```purs
-- | Pipe.FS.read "foo.csv" -- read
-- | >-/-> Pipe.CSV.parse -- parse
-- | >-/-> Pipe.CBOR.encode -- encode
-- | >-> Pipe.FS.write "foo.bin" -- write
-- | ```
-- |
-- | With this change:
-- | * `read` will pass chunks to `parse` as fast as `parse` allows
-- | * `parse` will parse chunks and yield them to `encode` as soon as they're ready
-- | * `encode` will encode chunks and yield them to `write` as soon as they're ready
sync :: forall x a b f p e m. MonadError e m => Alternative p => Parallel p m => MonadFork f m => MonadAff m => AsyncPipe x m (Maybe a) (Maybe b) -> Pipe (Maybe a) (Maybe b) m Unit
sync (AsyncPipe init write awaitWrite read awaitRead) =
let
liftPipe :: forall r. (Proxy _ _ _ _ m) r -> ExceptT (Step _ _) (Proxy _ _ _ _ m) r
liftPipe = lift
liftM :: forall r. m r -> ExceptT (Step _ _) (Proxy _ _ _ _ m) r
liftM = liftPipe <<< lift
continue a = throwError (Loop a)
break = throwError (Done unit)
awaitRW x = parOneOf [Right <$> awaitWrite x, Left <$> awaitRead x]
wSignal WriteSignalOk = WriteAgain
wSignal WriteSignalEnded = WriteEnded
in do
x <- lift init
flip tailRecM WriteAgain
\w ->
map (either identity identity)
$ runExceptT do
rb <- liftM $ read x
case rb of
ReadWouldBlock
| w == WriteEnded -> liftM (awaitRead x) *> continue w
| w == WriteNeedsDrain -> liftM (awaitRW x) >>= either (const $ continue w) (continue <<< wSignal)
| otherwise -> pure unit
ReadOk (Just b) -> liftPipe (yield $ Just b) *> continue w
ReadOk Nothing -> liftPipe (yield Nothing) *> break
when (w /= WriteAgain) $ continue w
a <- liftPipe await
w' <- liftM $ write x a
when (isNothing a) $ continue WriteEnded
pure $ Loop w'
-- | Implementation of `(>-/->)`
-- |
-- | In the current `MonadFork` "thread", read data from the `AsyncPipe` as it
-- | is yielded and `yield` to the downstream `Consumer`.
-- |
-- | Concurrently, in a separate thread, read data from the upstream `Producer`
-- | and write to the `AsyncPipe` at max throughput.
-- |
-- | If the producing half fails, the error is caught and rethrown.
-- |
-- | If the consuming half fails, the error is caught, the producing half is killed, and the error is rethrown.
pipeAsync
:: forall f m x a b
. MonadRec m
=> MonadAff m
=> MonadBracket Error f m
=> Producer (Maybe a) m Unit
-> AsyncPipe x m (Maybe a) (Maybe b)
-> Producer (Maybe b) m Unit
pipeAsync prod (AsyncPipe init write awaitWrite read awaitRead) =
do
errST :: STRef _ (Maybe Error) <- lift $ liftEffect $ liftST $ ST.Ref.new Nothing
killST :: STRef _ Boolean <- lift $ liftEffect $ liftST $ ST.Ref.new false
let
killThread = void $ liftEffect $ liftST $ ST.Ref.write true killST
threadKilled = liftEffect $ liftST $ ST.Ref.read killST
putThreadError = void <<< liftEffect <<< liftST <<< flip ST.Ref.write errST <<< Just
getThreadError = liftEffect $ liftST $ ST.Ref.read errST
rx x a = do
killed <- threadKilled
guard $ not killed
w <- lift $ write x a
case w of
WriteNeedsDrain -> lift $ void $ awaitWrite x
WriteEnded -> empty
WriteAgain -> pure unit
spawn = lift <<< fork <<< flip catchError putThreadError
x <- lift init
_thread <- spawn $ void $ runMaybeT $ Collect.foreach (rx x) (hoist lift prod)
flip tailRecM unit $ const do
getThreadError >>= traverse_ throwError
rb <- lift $ read x
case rb of
ReadOk (Just b) -> yield (Just b) $> Loop unit
ReadOk Nothing -> killThread *> yield Nothing $> Done unit
ReadWouldBlock -> void (lift (awaitRead x)) $> Loop unit
infixl 7 pipeAsync as >-/->


@@ -1 +0,0 @@
module Pipes.CSV.Parse where


@@ -2,12 +2,9 @@ module Pipes.Collect where
import Prelude
import Control.Monad.Maybe.Trans (MaybeT(..), runMaybeT)
import Control.Monad.Rec.Class (class MonadRec, Step(..), tailRecM)
import Control.Monad.ST.Class (liftST)
import Control.Monad.Trans.Class (lift)
import Data.Array.ST as Array.ST
import Data.Either (hush)
import Data.HashMap (HashMap)
import Data.HashMap as HashMap
import Data.Hashable (class Hashable)
@@ -15,45 +12,65 @@ import Data.List (List)
import Data.List as List
import Data.Map (Map)
import Data.Map as Map
import Data.Maybe (fromMaybe)
import Data.Maybe (Maybe(..), maybe)
import Data.Tuple.Nested (type (/\), (/\))
import Effect.Class (class MonadEffect, liftEffect)
import Foreign.Object (Object)
import Foreign.Object.ST as Object.ST
import Foreign.Object.ST.Unsafe as Object.ST.Unsafe
import Pipes (next) as Pipes
import Node.Buffer (Buffer)
import Node.Buffer as Buffer
import Node.Encoding (Encoding)
import Pipes.Core (Producer)
-- | Fold every value produced
-- |
-- | Uses `MonadRec`, supporting producers of arbitrary length.
fold :: forall a b m. MonadRec m => (b -> a -> b) -> b -> Producer a m Unit -> m b
fold f b p =
let
insertNext b' p' = runMaybeT do
a /\ p'' <- MaybeT $ hush <$> Pipes.next p'
pure $ Loop $ f b' a /\ p''
in
flip tailRecM (b /\ p) \(b' /\ p') -> fromMaybe (Done b') <$> insertNext b' p'
import Pipes.Internal (Proxy(..))
-- | Fold every value produced with a monadic action
-- |
-- | Uses `MonadRec`, supporting producers of arbitrary length.
traverse :: forall a b m. MonadRec m => (b -> a -> m b) -> b -> Producer a m Unit -> m b
traverse f b p =
let
insertNext b' p' = runMaybeT do
a /\ p'' <- MaybeT $ hush <$> Pipes.next p'
b'' <- lift $ f b' a
pure $ Loop $ b'' /\ p''
in
flip tailRecM (b /\ p) \(b' /\ p') -> fromMaybe (Done b') <$> insertNext b' p'
traverse f b0 p0 =
flip tailRecM (p0 /\ b0) \(p /\ b) ->
case p of
Respond a m -> Loop <$> (m unit /\ _) <$> f b a
M m -> Loop <$> (_ /\ b) <$> m
Request _ _ -> pure $ Done b
Pure _ -> pure $ Done b
-- | Fold every value produced
-- |
-- | Uses `MonadRec`, supporting producers of arbitrary length.
fold :: forall a b m. MonadRec m => (b -> a -> b) -> b -> Producer a m Unit -> m b
fold f b0 p0 = traverse (\b a -> pure $ f b a) b0 p0
-- | Execute a monadic action on every item in a producer.
-- |
-- | Uses `MonadRec`, supporting producers of arbitrary length.
foreach :: forall a m. MonadRec m => (a -> m Unit) -> Producer a m Unit -> m Unit
foreach f = traverse (const f) unit
foreach f p0 = traverse (\_ a -> f a) unit p0
-- | `append` all emitted values to `mempty`
toMonoid :: forall a m. Monoid a => MonadRec m => MonadEffect m => Producer a m Unit -> m a
toMonoid = fold (<>) mempty
-- | Concatenate all buffers to a single buffer, then decode with the
-- | provided encoding.
toStringWith :: forall m. MonadRec m => MonadEffect m => Encoding -> Producer Buffer m Unit -> m String
toStringWith enc = (liftEffect <<< Buffer.toString enc) <=< toBuffer
-- | Concatenate all produced buffers to a single buffer
toBuffer :: forall m. MonadRec m => MonadEffect m => Producer Buffer m Unit -> m Buffer
toBuffer p =
(liftEffect <<< maybe (Buffer.alloc 0) pure)
=<< traverse
( flip \b ->
case _ of
Just acc -> do
new <- liftEffect $ Buffer.concat [ acc, b ]
pure $ Just new
_ -> pure $ Just b
)
Nothing
p
-- | Collect all values from a `Producer` into an array.
toArray :: forall a m. MonadRec m => MonadEffect m => Producer a m Unit -> m (Array a)
@@ -63,9 +80,18 @@ toArray p = do
liftEffect $ liftST $ Array.ST.unsafeFreeze st
-- | Collect all values from a `Producer` into a list.
-- |
-- | Reverses the list after collecting, so that values will be
-- | in the order they were emitted.
toList :: forall a m. MonadRec m => MonadEffect m => Producer a m Unit -> m (List a)
toList = map List.reverse <<< fold (flip List.Cons) List.Nil
-- | Collect all values from a `Producer` into a list.
-- |
-- | Does not reverse the list after collecting.
toListRev :: forall a m. MonadRec m => MonadEffect m => Producer a m Unit -> m (List a)
toListRev = map List.reverse <<< fold (flip List.Cons) List.Nil
-- | Collect all values from a `Producer` into a Javascript Object.
toObject :: forall a m. MonadRec m => MonadEffect m => Producer (String /\ a) m Unit -> m (Object a)
toObject p = do


@@ -1 +1,64 @@
module Pipes.Construct where
import Prelude
import Control.Monad.Maybe.Trans (MaybeT(..), runMaybeT)
import Control.Monad.Rec.Class (class MonadRec, Step(..), tailRecM)
import Control.Monad.ST.Class (liftST)
import Control.Monad.Trans.Class (lift)
import Data.Array as Array
import Data.Array.ST as Array.ST
import Data.List (List)
import Data.List as List
import Data.Map (Map)
import Data.Map.Internal as Map.Internal
import Data.Maybe (fromMaybe)
import Data.Tuple.Nested (type (/\), (/\))
import Effect.Class (class MonadEffect, liftEffect)
import Pipes (yield, (>->))
import Pipes.Core (Producer)
import Pipes.Prelude as Pipe
import Pipes.Util as Pipe.Util
-- Producer that will emit monotonically increasing integers
-- ex `monotonic 0 -> 0 1 2 3 4 5 6 7 ..`
monotonic :: forall m. MonadRec m => Int -> Producer Int m Unit
monotonic start = flip tailRecM start \n -> yield n $> Loop (n + 1)
-- Producer that will emit integers from `start` (inclusive) to `end` (exclusive)
range :: forall m. MonadRec m => Int -> Int -> Producer Int m Unit
range start end = monotonic start >-> Pipe.take end
-- | Stack-safe producer that yields every value in an Array
eachArray :: forall a m. MonadRec m => Array a -> Producer a m Unit
eachArray as = monotonic 0 >-> Pipe.map (Array.index as) >-> Pipe.Util.whileJust
-- | Stack-safe producer that yields every value in a List
eachList :: forall a m. MonadRec m => List a -> Producer a m Unit
eachList init =
flip tailRecM init \as -> fromMaybe (Done unit) <$> runMaybeT do
head <- MaybeT $ pure $ List.head as
tail <- MaybeT $ pure $ List.tail as
lift $ yield head
pure $ Loop tail
-- | Stack-safe producer that yields every value in a Map
eachMap :: forall k v m. MonadEffect m => MonadRec m => Map k v -> Producer (k /\ v) m Unit
eachMap init = do
stack <- liftEffect $ liftST $ Array.ST.new
let
push a = void $ liftEffect $ liftST $ Array.ST.push a stack
pop = liftEffect $ liftST $ Array.ST.pop stack
flip tailRecM init case _ of
Map.Internal.Leaf -> fromMaybe (Done unit) <$> runMaybeT do
a <- MaybeT pop
pure $ Loop a
Map.Internal.Node _ _ k v Map.Internal.Leaf Map.Internal.Leaf -> do
yield $ k /\ v
pure $ Loop Map.Internal.Leaf
Map.Internal.Node _ _ k v Map.Internal.Leaf r -> do
yield $ k /\ v
pure $ Loop r
Map.Internal.Node a b k v l r -> do
push $ Map.Internal.Node a b k v Map.Internal.Leaf r
pure $ Loop l


@@ -2,11 +2,13 @@ module Pipes.Node.FS where
import Prelude
import Control.Monad.Error.Class (class MonadThrow)
import Data.Maybe (Maybe)
import Effect.Aff (Aff)
import Effect.Aff.Class (class MonadAff)
import Effect.Class (liftEffect)
import Effect.Exception (Error)
import Node.Buffer (Buffer)
import Node.FS.Stream (WriteStreamOptions)
import Node.FS.Stream (WriteStreamOptions, ReadStreamOptions)
import Node.FS.Stream as FS.Stream
import Node.Path (FilePath)
import Node.Stream.Object as O
@@ -21,39 +23,57 @@ import Prim.Row (class Union)
-- |
-- | See `Pipes.Node.Stream.withEOS` for converting `Producer a`
-- | into `Producer (Maybe a)`, emitting `Nothing` before exiting.
write
:: forall r trash
write'
:: forall r trash m
. Union r trash WriteStreamOptions
=> MonadAff m
=> MonadThrow Error m
=> Record r
-> FilePath
-> Consumer (Maybe Buffer) Aff Unit
write o p = do
-> Consumer (Maybe Buffer) m Unit
write' o p = do
w <- liftEffect $ FS.Stream.createWriteStream' p o
fromWritable $ O.fromBufferWritable w
fromWritable $ O.unsafeCoerceWritable w
-- | Open a file in write mode, failing if the file already exists.
-- |
-- | `write {flags: "wx"}`
create :: FilePath -> Consumer (Maybe Buffer) Aff Unit
create = write { flags: "wx" }
-- | `write' {flags: "wx"}`
create :: forall m. MonadAff m => MonadThrow Error m => FilePath -> Consumer (Maybe Buffer) m Unit
create = write' { flags: "wx" }
-- | Open a file in write mode, truncating it if the file already exists.
-- |
-- | `write {flags: "w"}`
truncate :: FilePath -> Consumer (Maybe Buffer) Aff Unit
truncate = write { flags: "w" }
-- | `write' {flags: "w"}`
trunc :: forall m. MonadAff m => MonadThrow Error m => FilePath -> Consumer (Maybe Buffer) m Unit
trunc = write' { flags: "w" }
-- | Open a file in write mode, appending written contents if the file already exists.
-- |
-- | `write {flags: "a"}`
append :: FilePath -> Consumer (Maybe Buffer) Aff Unit
append = write { flags: "a" }
-- | `write' {flags: "a"}`
append :: forall m. MonadAff m => MonadThrow Error m => FilePath -> Consumer (Maybe Buffer) m Unit
append = write' { flags: "a" }
-- | Creates a `fs.Readable` stream for the file at the given path.
-- |
-- | Emits `Nothing` before closing. To opt out of this behavior,
-- | use `Pipes.Node.Stream.withoutEOS` or `Pipes.Node.Stream.unEOS`.
read :: FilePath -> Producer (Maybe Buffer) Aff Unit
read :: forall m. MonadAff m => MonadThrow Error m => FilePath -> Producer (Maybe Buffer) m Unit
read p = do
r <- liftEffect $ FS.Stream.createReadStream p
fromReadable $ O.fromBufferReadable r
fromReadable $ O.unsafeCoerceReadable r
-- | Creates a `fs.Readable` stream for the file at the given path.
-- |
-- | Emits `Nothing` before closing. To opt out of this behavior,
-- | use `Pipes.Node.Stream.withoutEOS` or `Pipes.Node.Stream.unEOS`.
read'
:: forall r trash m
. Union r trash ReadStreamOptions
=> MonadAff m
=> MonadThrow Error m
=> Record r
-> FilePath
-> Producer (Maybe Buffer) m Unit
read' opts p = do
r <- liftEffect $ FS.Stream.createReadStream' p opts
fromReadable $ O.unsafeCoerceReadable r


@@ -1,31 +1,36 @@
module Pipes.Node.Stream where
import Prelude
import Prelude hiding (join)
import Control.Monad.Error.Class (throwError)
import Control.Monad.Error.Class (class MonadThrow, throwError)
import Control.Monad.Rec.Class (class MonadRec, Step(..), tailRecM)
import Control.Monad.ST.Class (liftST)
import Control.Monad.ST.Global (Global)
import Control.Monad.ST.Ref (STRef)
import Control.Monad.ST.Ref as ST.Ref
import Control.Monad.ST.Ref as STRef
import Control.Monad.Trans.Class (lift)
import Data.Maybe (Maybe(..))
import Data.Newtype (wrap)
import Data.Traversable (for_)
import Data.Maybe (Maybe(..), maybe)
import Data.Traversable (for_, traverse_)
import Data.Tuple.Nested ((/\))
import Effect.Aff (Aff, delay)
import Effect.Aff.Class (liftAff)
import Effect.Class (liftEffect)
import Effect (Effect)
import Effect.Aff.Class (class MonadAff, liftAff)
import Effect.Class (class MonadEffect, liftEffect)
import Effect.Exception (Error)
import Node.Stream.Object (WriteResult(..), maybeReadResult)
import Node.Stream.Object as O
import Pipes (await, yield)
import Pipes (for) as P
import Pipes.Async (AsyncPipe(..))
import Pipes.Async as AsyncPipe
import Pipes.Core (Consumer, Pipe, Producer, Producer_)
import Pipes.Prelude (mapFoldable) as P
import Pipes.Util (InvokeResult(..), invoke)
-- | Convert a `Readable` stream to a `Pipe`.
-- |
-- | This will yield `Nothing` before exiting, signaling
-- | End-of-stream.
fromReadable :: forall s a. O.Read s a => s -> Producer_ (Maybe a) Aff Unit
fromReadable :: forall s a m. MonadThrow Error m => MonadAff m => O.Read s a => s -> Producer_ (Maybe a) m Unit
fromReadable r =
let
cleanup rmErrorListener = do
@@ -33,15 +38,19 @@ fromReadable r =
pure $ Done unit
go { error, cancel } = do
liftAff $ delay $ wrap 0.0
err <- liftEffect $ liftST $ STRef.read error
for_ err throwError
res <- liftEffect $ O.read r
case res of
O.ReadJust a -> yield (Just a) $> Loop { error, cancel }
O.ReadWouldBlock -> lift (O.awaitReadableOrClosed r) $> Loop { error, cancel }
O.ReadClosed -> yield Nothing *> cleanup cancel
O.ReadWouldBlock -> do
ended <- liftEffect $ O.isReadableEnded r
if ended then do
yield Nothing
cleanup cancel
else
liftAff (O.awaitReadableOrClosed r) $> Loop { error, cancel }
in
do
e <- liftEffect $ O.withErrorST r
@@ -51,86 +60,119 @@ fromReadable r =
-- |
-- | When `Nothing` is piped to this, the stream will
-- | be `end`ed, and the pipe will noop if invoked again.
fromWritable :: forall s a. O.Write s a => s -> Consumer (Maybe a) Aff Unit
fromWritable w =
fromWritable :: forall s a m. MonadThrow Error m => MonadAff m => O.Write s a => s -> Consumer (Maybe a) m Unit
fromWritable w = do
{ error: errorST, cancel: removeErrorListener } <- liftEffect $ O.withErrorST w
let
cleanup rmErrorListener = do
liftEffect rmErrorListener
liftEffect $ O.end w
pure $ Done unit
maybeThrow = traverse_ throwError =<< liftEffect (liftST $ STRef.read errorST)
go { error, cancel } = do
liftAff $ delay $ wrap 0.0
err <- liftEffect $ liftST $ STRef.read error
for_ err throwError
waitCanWrite = do
shouldWait <- liftEffect $ O.needsDrain w
when shouldWait $ liftAff $ O.awaitWritableOrClosed w
ma <- await
case ma of
Nothing -> cleanup cancel
Just a -> do
res <- liftEffect $ O.write w a
case res of
O.WriteOk -> pure $ Loop { error, cancel }
O.WriteWouldBlock -> do
liftAff (O.awaitWritableOrClosed w)
pure $ Loop { error, cancel }
O.WriteClosed -> cleanup cancel
in
do
r <- liftEffect $ O.withErrorST w
tailRecM go r
cleanup = do
liftAff $ O.awaitFinished w
maybeThrow
liftEffect removeErrorListener
-- | Convert a `Transform` stream to a `Pipe`.
-- |
-- | When `Nothing` is piped to this, the `Transform` stream will
-- | be `end`ed, and the pipe will noop if invoked again.
fromTransform :: forall a b. O.Transform a b -> Pipe (Maybe a) (Maybe b) Aff Unit
onEOS = liftEffect (O.end w) *> cleanup $> Done unit
onChunk a = liftEffect (O.write w a) $> Loop unit
go _ = do
maybeThrow
waitCanWrite
ended <- liftEffect $ O.isWritableEnded w
if ended then
cleanup $> Done unit
else
await >>= maybe onEOS onChunk
tailRecM go unit
newtype TransformContext a b =
TransformContext
{ stream :: O.Transform a b
, removeErrorListener :: Effect Unit
, errorST :: STRef Global (Maybe Error)
}
transformCleanup :: forall m a b. MonadEffect m => TransformContext a b -> m Unit
transformCleanup (TransformContext {removeErrorListener}) = do
liftEffect removeErrorListener
transformStream :: forall a b. TransformContext a b -> O.Transform a b
transformStream (TransformContext {stream}) = stream
transformRethrow :: forall m a b. MonadThrow Error m => MonadEffect m => TransformContext a b -> m Unit
transformRethrow (TransformContext {errorST}) = traverse_ throwError =<< liftEffect (liftST $ ST.Ref.read errorST)
-- | Convert a `Transform` stream to an `AsyncPipe`.
fromTransform
:: forall a b m
. MonadThrow Error m
=> MonadAff m
=> Effect (O.Transform a b)
-> AsyncPipe (TransformContext a b) m (Maybe a) (Maybe b)
fromTransform t =
let
cleanup removeErrorListener = do
liftEffect $ O.end t
liftEffect $ removeErrorListener
fromReadable t
pure $ Done unit
yieldFromReadableHalf = do
res <- liftEffect (O.read t)
case res of
O.ReadJust a -> yield (Just a)
O.ReadWouldBlock -> pure unit
O.ReadClosed -> yield Nothing *> pure unit
go { error, cancel } = do
liftAff $ delay $ wrap 0.0
err <- liftEffect $ liftST $ STRef.read error
for_ err throwError
ma <- await
case ma of
Nothing -> cleanup cancel
Just a' -> do
res <- liftEffect $ O.write t a'
yieldFromReadableHalf
case res of
O.WriteClosed -> cleanup cancel
O.WriteOk -> pure $ Loop { error, cancel }
O.WriteWouldBlock -> do
lift (O.awaitWritableOrClosed t)
pure $ Loop { error, cancel }
init = do
stream <- liftEffect t
{ error: errorST, cancel: removeErrorListener } <- liftEffect $ O.withErrorST stream
pure $ TransformContext {errorST, removeErrorListener, stream}
write x Nothing = do
let s = transformStream x
liftEffect $ O.end s
pure AsyncPipe.WriteEnded
write x (Just a) = do
transformRethrow x
let s = transformStream x
w <- liftEffect $ O.write s a
pure $ case w of
WriteOk -> AsyncPipe.WriteAgain
WriteWouldBlock -> AsyncPipe.WriteNeedsDrain
awaitWrite x = do
transformRethrow x
let s = transformStream x
liftAff $ O.awaitWritableOrClosed s
ended <- liftEffect $ O.isWritableEnded s
if ended then
pure $ AsyncPipe.WriteSignalEnded
else do
liftAff $ O.awaitWritableOrClosed s
pure $ AsyncPipe.WriteSignalOk
read x =
do
transformRethrow x
let s = transformStream x
readEnded <- liftEffect $ O.isReadableEnded s
if readEnded then do
pure $ AsyncPipe.ReadOk Nothing
else
maybe AsyncPipe.ReadWouldBlock (AsyncPipe.ReadOk <<< Just) <$> maybeReadResult <$> liftEffect (O.read s)
awaitRead x = do
transformRethrow x
let s = transformStream x
ended <- liftEffect $ O.isReadableEnded s
if ended then
pure $ AsyncPipe.ReadSignalEnded
else do
liftAff $ O.awaitReadableOrClosed s
pure $ AsyncPipe.ReadSignalOk
in
do
r <- liftEffect $ O.withErrorST t
tailRecM go r
AsyncPipe init write awaitWrite read awaitRead
-- | Given a `Producer` of values, wrap them in `Just`.
-- |
-- | Before the `Producer` exits, emits `Nothing` as an End-of-stream signal.
withEOS :: forall a. Producer a Aff Unit -> Producer (Maybe a) Aff Unit
withEOS :: forall a m. Monad m => Producer a m Unit -> Producer (Maybe a) m Unit
withEOS a = do
P.for a (yield <<< Just)
yield Nothing
-- | Strip a pipeline of the EOS signal
unEOS :: forall a. Pipe (Maybe a) a Aff Unit
unEOS = P.mapFoldable identity
unEOS :: forall a m. Monad m => Pipe (Maybe a) a m Unit
unEOS = tailRecM (const $ maybe (pure $ Done unit) (\a -> yield a $> Loop unit) =<< await) unit
-- | Lift a `Pipe a a` to `Pipe (Maybe a) (Maybe a)`.
-- |


@@ -2,39 +2,44 @@ module Pipes.Node.Zlib where
import Prelude
import Control.Monad.Error.Class (class MonadThrow)
import Data.Maybe (Maybe)
import Effect (Effect)
import Effect.Aff (Aff)
import Effect.Aff.Class (class MonadAff)
import Effect.Class (liftEffect)
import Effect.Exception (Error)
import Node.Buffer (Buffer)
import Node.Stream.Object as O
import Node.Zlib as Zlib
import Node.Zlib.Types (ZlibStream)
import Pipes.Core (Pipe)
import Pipes.Node.Stream (fromTransform)
import Pipes.Async (AsyncPipe)
import Pipes.Node.Stream (TransformContext, fromTransform)
fromZlib :: forall r. Effect (ZlibStream r) -> Pipe (Maybe Buffer) (Maybe Buffer) Aff Unit
fromZlib z = do
raw <- liftEffect $ Zlib.toDuplex <$> z
fromTransform $ O.fromBufferTransform raw
type X = TransformContext Buffer Buffer
gzip :: Pipe (Maybe Buffer) (Maybe Buffer) Aff Unit
fromZlib :: forall r m. MonadAff m => MonadThrow Error m => Effect (ZlibStream r) -> AsyncPipe X m (Maybe Buffer) (Maybe Buffer)
fromZlib z =
fromTransform do
raw <- liftEffect $ Zlib.toDuplex <$> z
pure $ O.unsafeCoerceTransform raw
gzip :: forall m. MonadAff m => MonadThrow Error m => AsyncPipe X m (Maybe Buffer) (Maybe Buffer)
gzip = fromZlib Zlib.createGzip
gunzip :: Pipe (Maybe Buffer) (Maybe Buffer) Aff Unit
gunzip :: forall m. MonadAff m => MonadThrow Error m => AsyncPipe X m (Maybe Buffer) (Maybe Buffer)
gunzip = fromZlib Zlib.createGunzip
unzip :: Pipe (Maybe Buffer) (Maybe Buffer) Aff Unit
unzip :: forall m. MonadAff m => MonadThrow Error m => AsyncPipe X m (Maybe Buffer) (Maybe Buffer)
unzip = fromZlib Zlib.createUnzip
inflate :: Pipe (Maybe Buffer) (Maybe Buffer) Aff Unit
inflate :: forall m. MonadAff m => MonadThrow Error m => AsyncPipe X m (Maybe Buffer) (Maybe Buffer)
inflate = fromZlib Zlib.createInflate
deflate :: Pipe (Maybe Buffer) (Maybe Buffer) Aff Unit
deflate :: forall m. MonadAff m => MonadThrow Error m => AsyncPipe X m (Maybe Buffer) (Maybe Buffer)
deflate = fromZlib Zlib.createDeflate
brotliCompress :: Pipe (Maybe Buffer) (Maybe Buffer) Aff Unit
brotliCompress :: forall m. MonadAff m => MonadThrow Error m => AsyncPipe X m (Maybe Buffer) (Maybe Buffer)
brotliCompress = fromZlib Zlib.createBrotliCompress
brotliDecompress :: Pipe (Maybe Buffer) (Maybe Buffer) Aff Unit
brotliDecompress :: forall m. MonadAff m => MonadThrow Error m => AsyncPipe X m (Maybe Buffer) (Maybe Buffer)
brotliDecompress = fromZlib Zlib.createBrotliDecompress


@@ -3,23 +3,35 @@ module Pipes.Util where
import Prelude
import Control.Monad.Maybe.Trans (MaybeT(..), runMaybeT)
import Control.Monad.Rec.Class (class MonadRec, forever, whileJust)
import Control.Monad.Rec.Class (class MonadRec, Step(..), forever, tailRecM)
import Control.Monad.Rec.Class as Rec
import Control.Monad.ST.Class (liftST)
import Control.Monad.ST.Ref (STRef)
import Control.Monad.ST.Ref as STRef
import Control.Monad.Trans.Class (lift)
import Data.Array.ST (STArray)
import Data.Array.ST as Array.ST
import Data.Either (hush)
import Data.HashSet as HashSet
import Data.Hashable (class Hashable, hash)
import Data.List.NonEmpty (NonEmptyList)
import Data.Maybe (Maybe(..))
import Data.Maybe (Maybe(..), fromMaybe)
import Data.Tuple.Nested (type (/\), (/\))
import Effect.Class (class MonadEffect, liftEffect)
import Pipes (await, yield)
import Pipes.Core (Pipe)
import Pipes as Pipes
import Pipes.Core (Pipe, Producer)
import Pipes.Internal (Proxy(..))
-- | Re-yield all `Just`s, and close when `Nothing` is encountered
whileJust :: forall m a. MonadRec m => Pipe (Maybe a) a m Unit
whileJust = do
first <- await
flip tailRecM first $ \ma -> fromMaybe (Done unit) <$> runMaybeT do
a <- MaybeT $ pure ma
lift $ yield a
lift $ Loop <$> await
-- | Yields a separator value `sep` between received values
-- |
-- | ```purescript
@@ -33,7 +45,7 @@ intersperse sep = do
getIsFirst = liftEffect $ liftST $ STRef.read isFirstST
markNotFirst = void $ liftEffect $ liftST $ STRef.write false isFirstST
whileJust $ runMaybeT do
Rec.whileJust $ runMaybeT do
a <- MaybeT await
isFirst <- getIsFirst
if isFirst then markNotFirst else lift $ yield $ Just sep
@@ -41,6 +53,16 @@ intersperse sep = do
yield Nothing
-- Pair every emitted value from 2 producers together, exiting when either exits.
zip :: forall a b m. MonadRec m => Producer a m Unit -> Producer b m Unit -> Producer (a /\ b) m Unit
zip as bs =
flip tailRecM (as /\ bs) \(as' /\ bs') ->
fromMaybe (Done unit) <$> runMaybeT do
a /\ as'' <- MaybeT $ lift $ hush <$> Pipes.next as'
b /\ bs'' <- MaybeT $ lift $ hush <$> Pipes.next bs'
lift $ yield $ a /\ b
pure $ Loop $ as'' /\ bs''
-- | Accumulate values in chunks of a given size.
-- |
-- | If the pipe closes without yielding a multiple of `size` elements,
@@ -60,12 +82,18 @@ chunked size = do
void $ flip STRef.write chunkST =<< Array.ST.new
Array.ST.unsafeFreeze chunkArray
whileJust $ runMaybeT do
Rec.whileJust $ runMaybeT do
a <- MaybeT await
chunkPut a
len <- chunkLength
when (len >= size) $ lift $ yield =<< Just <$> chunkTake
yield =<< Just <$> chunkTake
len <- lift chunkLength
when (len >= size) do
chunk <- lift chunkTake
lift $ yield $ Just chunk
len <- chunkLength
when (len > 0) do
chunk <- chunkTake
yield $ Just chunk
yield Nothing
-- | Equivalent of unix `uniq`, filtering out duplicate values passed to it.
@@ -121,3 +149,5 @@ invoke m a =
Pure _ -> pure Exited
in
go (IDidNotAwait m)


@@ -3,18 +3,21 @@ module Test.Main where
import Prelude
import Data.Maybe (Maybe(..))
import Data.Time.Duration (Milliseconds(..))
import Effect (Effect)
import Effect.Aff (launchAff_)
import Test.Pipes.Node.Stream as Test.Pipes.Node.Stream
import Test.Pipes.Collect as Test.Pipes.Collect
import Test.Pipes.Construct as Test.Pipes.Construct
import Test.Pipes.Node.Buffer as Test.Pipes.Node.Buffer
import Test.Pipes.Node.FS as Test.Pipes.Node.FS
import Test.Pipes.Collect as Test.Pipes.Collect
import Test.Pipes.Node.Stream as Test.Pipes.Node.Stream
import Test.Spec.Reporter (specReporter)
import Test.Spec.Runner (defaultConfig, runSpec')
main :: Effect Unit
main = launchAff_ $ runSpec' (defaultConfig { failFast = true, timeout = Nothing }) [ specReporter ] do
main = launchAff_ $ runSpec' (defaultConfig { slow = Milliseconds 0.0, failFast = true, exit = false, timeout = Nothing }) [ specReporter ] do
Test.Pipes.Node.Stream.spec
Test.Pipes.Node.Buffer.spec
Test.Pipes.Node.FS.spec
Test.Pipes.Collect.spec
Test.Pipes.Construct.spec


@@ -0,0 +1,58 @@
module Test.Pipes.Construct where
import Prelude
import Data.Array as Array
import Data.List as List
import Data.Map as Map
import Data.Tuple.Nested (type (/\), (/\))
import Effect.Class (liftEffect)
import Pipes.Collect as Pipes.Collect
import Pipes.Construct as Pipes.Construct
import Test.Spec (Spec, describe, it)
import Test.Spec.Assertions (shouldEqual)
spec :: Spec Unit
spec =
describe "Test.Pipes.Construct" do
describe "eachMap" do
it "empty map" do
kvs <- Pipes.Collect.toArray $ Pipes.Construct.eachMap Map.empty
kvs `shouldEqual` ([] :: Array (Int /\ Int))
it "nonempty map" do
let
exp = (\n -> n /\ n) <$> Array.range 0 99999
map = Map.fromFoldable exp
kvs <-
liftEffect
$ Pipes.Collect.toArray
$ Pipes.Construct.eachMap
$ map
kvs `shouldEqual` exp
describe "eachArray" do
it "empty array" do
kvs <- Pipes.Collect.toArray $ Pipes.Construct.eachArray []
kvs `shouldEqual` ([] :: Array Int)
it "nonempty array" do
let
inp = (\n -> n /\ n) <$> Array.range 0 99999
kvs <-
liftEffect
$ Pipes.Collect.toArray
$ Pipes.Construct.eachArray
$ inp
kvs `shouldEqual` inp
describe "eachList" do
it "empty list" do
kvs <- Pipes.Collect.toArray $ Pipes.Construct.eachList List.Nil
kvs `shouldEqual` ([] :: Array Int)
it "nonempty list" do
let
inp = (\n -> n /\ n) <$> Array.range 0 99999
kvs <-
liftEffect
$ Pipes.Collect.toArray
$ Pipes.Construct.eachList
$ List.fromFoldable
$ inp
kvs `shouldEqual` inp


@@ -33,18 +33,13 @@ spec = describe "Pipes.Node.FS" do
liftEffect $ FS.writeTextFile UTF8 p "foo"
s <- fold <$> Pipes.toListM (Pipes.Node.FS.read p >-> unEOS >-> Pipes.Node.Buffer.toString UTF8)
s `shouldEqual` "foo"
around tmpFile $ it "fails if the file already exists" \p -> do
liftEffect $ FS.writeTextFile UTF8 "foo" p
flip catchError (const $ pure unit) do
Pipes.runEffect $ withEOS (yield "foo" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.create p
fail "should have thrown"
describe "create" do
around tmpFile $ it "creates the file when not exists" \p -> do
Pipes.runEffect $ withEOS (yield "foo" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.create p
contents <- liftEffect $ FS.readTextFile UTF8 p
contents `shouldEqual` "foo"
around tmpFile $ it "fails if the file already exists" \p -> do
liftEffect $ FS.writeTextFile UTF8 "foo" p
liftEffect $ FS.writeTextFile UTF8 p "foo"
flip catchError (const $ pure unit) do
Pipes.runEffect $ withEOS (yield "foo" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.create p
fail "should have thrown"
@@ -59,14 +54,14 @@ spec = describe "Pipes.Node.FS" do
Pipes.runEffect $ withEOS (yield "bar" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.append p
contents <- liftEffect $ FS.readTextFile UTF8 p
contents `shouldEqual` "foo\nbar"
describe "truncate" do
describe "trunc" do
around tmpFile $ it "creates the file when not exists" \p -> do
Pipes.runEffect $ withEOS (yield "foo" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.truncate p
Pipes.runEffect $ withEOS (yield "foo" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.trunc p
contents <- liftEffect $ FS.readTextFile UTF8 p
contents `shouldEqual` "foo"
around tmpFile $ it "overwrites contents" \p -> do
Pipes.runEffect $ withEOS (yield "foo" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.truncate p
Pipes.runEffect $ withEOS (yield "bar" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.truncate p
Pipes.runEffect $ withEOS (yield "foo" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.trunc p
Pipes.runEffect $ withEOS (yield "bar" >-> Pipes.Node.Buffer.fromString UTF8) >-> Pipes.Node.FS.trunc p
contents <- liftEffect $ FS.readTextFile UTF8 p
contents `shouldEqual` "bar"
around tmpFiles $ it "json lines >-> parse >-> _.foo >-> write" \(a /\ b) -> do

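The write pipes exercised above differ only in how they open the target file: `create` fails when it already exists, `trunc` truncates (or creates) it, and `append` adds to (or creates) it. A minimal usage sketch, assuming the same imports as the tests; the output path and the `Aff` result monad are placeholders.

```purescript
module Sketch.FSWrite where

import Prelude

import Effect.Aff (Aff)
import Node.Encoding (Encoding(..))
import Pipes (yield, (>->))
import Pipes.Core (runEffect)
import Pipes.Node.Buffer as Pipes.Node.Buffer
import Pipes.Node.FS as Pipes.Node.FS
import Pipes.Node.Stream (withEOS)

-- Write "hello" to ./out.txt, truncating any existing contents; swap in
-- `create` or `append` for the other modes shown in the tests.
writeHello :: Aff Unit
writeHello =
  runEffect
    $ withEOS (yield "hello" >-> Pipes.Node.Buffer.fromString UTF8)
    >-> Pipes.Node.FS.trunc "./out.txt"
```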

@@ -1,4 +1,16 @@
import Stream from 'stream'
import * as CBOR from "cbor-x";
import * as CSVDecode from "csv-parse";
import * as CSVEncode from "csv-stringify";
export const cborDecode = () => new CBOR.DecoderStream({useRecords: false, allowHalfOpen: true});
export const cborEncode = () => new CBOR.EncoderStream({useRecords: false, allowHalfOpen: true});
export const cborDecodeSync = a => () => CBOR.decodeMultiple(a);
export const cborEncodeSync = a => () => CBOR.encode(a, {useRecords: false});
export const csvDecode = () => CSVDecode.parse({columns: true, allowHalfOpen: true})
export const csvEncode = () => CSVEncode.stringify({header: true, allowHalfOpen: true})
export const discardTransform = () => new Stream.Transform({
transform: function(_ck, _enc, cb) {
@@ -7,6 +19,16 @@ export const discardTransform = () => new Stream.Transform({
objectMode: true
})
export const slowTransform = () => {
return new Stream.Transform({
transform: function(ck, _enc, cb) {
this.push(ck)
setTimeout(() => cb(), 4)
},
objectMode: true
})
}
export const charsTransform = () => new Stream.Transform({
transform: function(ck, _enc, cb) {
ck.split('').filter(s => !!s).forEach(s => this.push(s))


@@ -2,49 +2,72 @@ module Test.Pipes.Node.Stream where
import Prelude
import Control.Monad.Trans.Class (lift)
import Control.Monad.Cont (lift)
import Control.Monad.Error.Class (liftEither)
import Control.Monad.Except (runExcept)
import Data.Array as Array
import Data.Foldable (fold)
import Data.Bifunctor (lmap)
import Data.Either (Either(..))
import Data.Foldable (fold, intercalate)
import Data.FoldableWithIndex (forWithIndex_)
import Data.FunctorWithIndex (mapWithIndex)
import Data.Int as Int
import Data.List ((:))
import Data.List as List
import Data.Maybe (Maybe)
import Data.Newtype (wrap)
import Data.Profunctor.Strong (first)
import Data.String as String
import Data.String.Gen (genAlphaString)
import Data.Traversable (for_, traverse)
import Data.Tuple.Nested (type (/\), (/\))
import Effect (Effect)
import Effect.Aff (Aff, delay)
import Effect.Class (class MonadEffect, liftEffect)
import Effect.Exception (error)
import Effect.Unsafe (unsafePerformEffect)
import Foreign (Foreign)
import Node.Buffer (Buffer)
import Node.Buffer as Buffer
import Node.Encoding (Encoding(..))
import Node.FS.Stream as FS.Stream
import Node.FS.Sync as FS
import Node.Stream.Object as O
import Node.Zlib as Zlib
import Pipes (each) as Pipes
import Pipes (each) as Pipe
import Pipes (yield, (>->))
import Pipes.Async (sync, (>-/->))
import Pipes.Collect as Pipe.Collect
import Pipes.Core (Consumer, Producer, runEffect)
import Pipes.Node.Buffer as Pipes.Buffer
import Pipes.Node.Stream as S
import Pipes.Prelude (mapFoldable, toListM) as Pipes
import Simple.JSON (writeJSON)
import Test.Common (jsonParse, jsonStringify, tmpFile, tmpFiles)
import Pipes.Node.Buffer as Pipe.Buffer
import Pipes.Node.FS as Pipe.FS
import Pipes.Node.Stream as Pipe.Node
import Pipes.Node.Zlib as Pipe.Zlib
import Pipes.Prelude (toListM) as Pipe
import Simple.JSON (readImpl, readJSON, writeJSON)
import Test.Common (jsonStringify, tmpFile, tmpFiles)
import Test.QuickCheck.Arbitrary (arbitrary)
import Test.QuickCheck.Gen (randomSample')
import Test.Spec (Spec, around, describe, it)
import Test.Spec.Assertions (shouldEqual)
import Test.Spec.Assertions (fail, shouldEqual)
foreign import readableFromArray :: forall @a. Array a -> O.Readable a
foreign import discardTransform :: forall a b. Effect (O.Transform a b)
foreign import slowTransform :: forall a b. Effect (O.Transform a b)
foreign import charsTransform :: Effect (O.Transform String String)
foreign import cborEncodeSync :: forall a. a -> Effect Buffer
foreign import cborDecodeSync :: forall a. Buffer -> Effect a
foreign import cborEncode :: forall a. Effect (O.Transform a Buffer)
foreign import cborDecode :: forall a. Effect (O.Transform Buffer a)
foreign import csvEncode :: forall a. Effect (O.Transform a String)
foreign import csvDecode :: forall a. Effect (O.Transform String a)
writer :: forall m. MonadEffect m => String -> m (O.Writable Buffer /\ Consumer (Maybe Buffer) Aff Unit)
writer a = do
stream <- liftEffect $ O.fromBufferWritable <$> FS.Stream.createWriteStream a
pure $ stream /\ S.fromWritable stream
stream <- liftEffect $ O.unsafeCoerceWritable <$> FS.Stream.createWriteStream a
pure $ stream /\ Pipe.Node.fromWritable stream
reader :: forall m. MonadEffect m => String -> m (Producer (Maybe Buffer) Aff Unit)
reader a = liftEffect $ S.fromReadable <$> O.fromBufferReadable <$> FS.Stream.createReadStream a
reader a = liftEffect $ Pipe.Node.fromReadable <$> O.unsafeCoerceReadable <$> FS.Stream.createReadStream a
spec :: Spec Unit
spec =
@@ -52,30 +75,30 @@ spec =
describe "Readable" do
describe "Readable.from(<Iterable>)" do
it "empty" do
vals <- Pipes.toListM $ (S.fromReadable $ readableFromArray @{ foo :: String } []) >-> S.unEOS
vals <- Pipe.toListM $ (Pipe.Node.fromReadable $ readableFromArray @{ foo :: String } []) >-> Pipe.Node.unEOS
vals `shouldEqual` List.Nil
it "singleton" do
vals <- Pipes.toListM $ (S.fromReadable $ readableFromArray @{ foo :: String } [ { foo: "1" } ]) >-> S.unEOS
vals <- Pipe.toListM $ (Pipe.Node.fromReadable $ readableFromArray @{ foo :: String } [ { foo: "1" } ]) >-> Pipe.Node.unEOS
vals `shouldEqual` ({ foo: "1" } : List.Nil)
it "many elements" do
let exp = (\n -> { foo: show n }) <$> Array.range 0 100
vals <- Pipes.toListM $ (S.fromReadable $ readableFromArray exp) >-> S.unEOS
vals <- Pipe.toListM $ (Pipe.Node.fromReadable $ readableFromArray exp) >-> Pipe.Node.unEOS
vals `shouldEqual` (List.fromFoldable exp)
describe "Writable" $ around tmpFile do
describe "fs.WriteStream" do
it "pipe to file" \p -> do
stream <- O.fromBufferWritable <$> liftEffect (FS.Stream.createWriteStream p)
stream <- O.unsafeCoerceWritable <$> liftEffect (FS.Stream.createWriteStream p)
let
w = S.fromWritable stream
w = Pipe.Node.fromWritable stream
source = do
buf <- liftEffect $ Buffer.fromString "hello" UTF8
yield buf
runEffect $ S.withEOS source >-> w
runEffect $ Pipe.Node.withEOS source >-> w
contents <- liftEffect $ FS.readTextFile UTF8 p
contents `shouldEqual` "hello"
shouldEqual true =<< liftEffect (O.isWritableEnded stream)
it "async pipe to file" \p -> do
w <- S.fromWritable <$> O.fromBufferWritable <$> liftEffect (FS.Stream.createWriteStream p)
w <- Pipe.Node.fromWritable <$> O.unsafeCoerceWritable <$> liftEffect (FS.Stream.createWriteStream p)
let
source = do
yield "hello, "
@@ -87,7 +110,7 @@ spec =
yield "this is a "
lift $ delay $ wrap 5.0
yield "test."
runEffect $ S.withEOS (source >-> Pipes.Buffer.fromString UTF8) >-> w
runEffect $ Pipe.Node.withEOS (source >-> Pipe.Buffer.fromString UTF8) >-> w
contents <- liftEffect $ FS.readTextFile UTF8 p
contents `shouldEqual` "hello, world! this is a test."
it "chained pipes" \p -> do
@@ -101,40 +124,96 @@ spec =
let
exp = fold (writeJSON <$> objs)
stream /\ w <- liftEffect $ writer p
runEffect $ S.withEOS (Pipes.each objs >-> jsonStringify >-> Pipes.Buffer.fromString UTF8) >-> w
runEffect $ Pipe.Node.withEOS (Pipe.each objs >-> jsonStringify >-> Pipe.Buffer.fromString UTF8) >-> w
contents <- liftEffect $ FS.readTextFile UTF8 p
contents `shouldEqual` exp
shouldEqual true =<< liftEffect (O.isWritableEnded stream)
describe "Transform" do
it "gzip" do
let
json = yield $ writeJSON { foo: "bar" }
exp = "1f8b0800000000000003ab564acbcf57b2524a4a2c52aa0500eff52bfe0d000000"
gzip <- S.fromTransform <$> O.fromBufferTransform <$> liftEffect (Zlib.toDuplex <$> Zlib.createGzip)
outs :: List.List String <- Pipes.toListM (S.withEOS (json >-> Pipes.Buffer.fromString UTF8) >-> gzip >-> S.unEOS >-> Pipes.Buffer.toString Hex)
fold outs `shouldEqual` exp
around tmpFiles
$ it "file >-> gzip >-> file >-> gunzip" \(a /\ b) -> do
liftEffect $ FS.writeTextFile UTF8 a $ writeJSON [ 1, 2, 3, 4 ]
areader <- liftEffect $ reader a
bwritestream /\ bwriter <- liftEffect $ writer b
gzip <- S.fromTransform <$> O.fromBufferTransform <$> liftEffect (Zlib.toDuplex <$> Zlib.createGzip)
runEffect $ areader >-> gzip >-> bwriter
shouldEqual true =<< liftEffect (O.isWritableEnded bwritestream)
let
bignums = Array.range 1 1000
firstNames = String.split (wrap "\n") $ unsafePerformEffect (FS.readTextFile UTF8 "./test/Test/first_names.txt")
lastNames = String.split (wrap "\n") $ unsafePerformEffect (FS.readTextFile UTF8 "./test/Test/last_names.txt")
names n = do
first <- firstNames
last <- Array.take (Int.round $ Int.toNumber n / Int.toNumber (Array.length firstNames)) lastNames
pure $ first <> " " <> last
people n = mapWithIndex (\ix name -> {id: show $ ix + 1, name}) (names n)
peopleCSV n = "id,name\n" <> intercalate "\n" ((\{id, name} -> id <> "," <> name) <$> people n)
gunzip <- S.fromTransform <$> O.fromBufferTransform <$> liftEffect (Zlib.toDuplex <$> Zlib.createGunzip)
breader <- liftEffect $ reader b
nums <- Pipes.toListM (breader >-> gunzip >-> S.unEOS >-> Pipes.Buffer.toString UTF8 >-> jsonParse @(Array Int) >-> Pipes.mapFoldable identity)
Array.fromFoldable nums `shouldEqual` [ 1, 2, 3, 4 ]
for_ [4000, 8000, 32000, 64000, 200000] \n -> do
let
csv = peopleCSV n
people' = people n
around tmpFiles
$ it (show n <> " row csv >-/-> csv-parse >-/-> cborEncode") \(a /\ _) -> do
liftEffect $ FS.writeTextFile UTF8 a csv
cbor :: Buffer <- Pipe.Collect.toBuffer
$ Pipe.FS.read a
>-> Pipe.Node.inEOS (Pipe.Buffer.toString UTF8)
>-/-> Pipe.Node.fromTransform csvDecode
>-/-> Pipe.Node.fromTransform cborEncode
>-> Pipe.Node.unEOS
f :: Array Foreign <- liftEffect $ cborDecodeSync cbor
ppl <- traverse (liftEither <<< lmap (error <<< show) <<< runExcept <<< readImpl) f
ppl `shouldEqual` people'
around tmpFiles
$ it (show n <> " row csv >-> sync csv-parse >-> sync cborEncode") \(a /\ _) -> do
liftEffect $ FS.writeTextFile UTF8 a csv
cbor :: Buffer <- Pipe.Collect.toBuffer
$ Pipe.FS.read a
>-> Pipe.Node.inEOS (Pipe.Buffer.toString UTF8)
>-> sync (Pipe.Node.fromTransform csvDecode)
>-> sync (Pipe.Node.fromTransform cborEncode)
>-> Pipe.Node.unEOS
f :: Array Foreign <- liftEffect $ cborDecodeSync cbor
ppl <- traverse (liftEither <<< lmap (error <<< show) <<< runExcept <<< readImpl) f
ppl `shouldEqual` people'
around tmpFiles
$ it "file >-> sync gzip >-> sync gunzip" \(a /\ _) -> do
liftEffect $ FS.writeTextFile UTF8 a $ writeJSON bignums
json <- Pipe.Collect.toMonoid
$ Pipe.FS.read a
>-> sync Pipe.Zlib.gzip
>-> sync Pipe.Zlib.gunzip
>-> Pipe.Node.unEOS
>-> Pipe.Buffer.toString UTF8
readJSON json `shouldEqual` (Right bignums)
around tmpFiles
$ it "file >-/-> gzip >-/-> slow >-/-> gunzip" \(a /\ _) -> do
liftEffect $ FS.writeTextFile UTF8 a $ writeJSON bignums
json <-
Pipe.Collect.toMonoid
$ Pipe.FS.read a
>-/-> Pipe.Zlib.gzip
>-/-> Pipe.Node.fromTransform slowTransform
>-/-> Pipe.Zlib.gunzip
>-> Pipe.Node.unEOS
>-> Pipe.Buffer.toString UTF8
readJSON json `shouldEqual` (Right bignums)
around tmpFiles
$ it "file >-> sync gzip >-> sync slow >-> sync gunzip" \(a /\ _) -> do
liftEffect $ FS.writeTextFile UTF8 a $ writeJSON bignums
json <-
Pipe.Collect.toMonoid
$ Pipe.FS.read a
>-> sync Pipe.Zlib.gzip
>-> sync (Pipe.Node.fromTransform slowTransform)
>-> sync Pipe.Zlib.gunzip
>-> Pipe.Node.unEOS
>-> Pipe.Buffer.toString UTF8
readJSON json `shouldEqual` (Right bignums)
around tmpFile $ it "file >-> discardTransform" \(p :: String) -> do
liftEffect $ FS.writeTextFile UTF8 p "foo"
r <- reader p
discard' <- liftEffect discardTransform
out :: List.List Int <- Pipes.toListM $ r >-> S.fromTransform discard' >-> S.unEOS
out :: List.List Int <- Pipe.toListM $ r >-/-> Pipe.Node.fromTransform discardTransform >-> Pipe.Node.unEOS
out `shouldEqual` List.Nil
around tmpFile $ it "file >-> charsTransform" \(p :: String) -> do
liftEffect $ FS.writeTextFile UTF8 p "foo bar"
r <- reader p
chars' <- liftEffect charsTransform
out :: List.List String <- Pipes.toListM $ r >-> S.inEOS (Pipes.Buffer.toString UTF8) >-> S.fromTransform chars' >-> S.unEOS
out :: List.List String <- Pipe.toListM $ r >-> Pipe.Node.inEOS (Pipe.Buffer.toString UTF8) >-/-> Pipe.Node.fromTransform charsTransform >-> Pipe.Node.unEOS
out `shouldEqual` List.fromFoldable [ "f", "o", "o", " ", "b", "a", "r" ]

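The suite above pairs each pipeline with the new AsyncPipe operator: `>-/->` composes a Node `Transform` as an AsyncPipe (the v2.0.0 changelog credits this with the Transform throughput improvement), while `sync` adapts the same AsyncPipe back to an ordinary `Pipe` so it composes with `>->`. A condensed sketch of the two styles, mirroring the gzip round-trip tests; the file path argument and the `Aff` result type are assumptions.

```purescript
module Sketch.AsyncPipe where

import Prelude

import Effect.Aff (Aff)
import Node.Encoding (Encoding(..))
import Pipes ((>->))
import Pipes.Async (sync, (>-/->))
import Pipes.Collect as Pipe.Collect
import Pipes.Node.Buffer as Pipe.Buffer
import Pipes.Node.FS as Pipe.FS
import Pipes.Node.Stream as Pipe.Node
import Pipes.Node.Zlib as Pipe.Zlib

-- gzip then gunzip a file's contents, collecting the result as a String.

-- Async composition: each Node Transform stays an AsyncPipe.
roundTripAsync :: String -> Aff String
roundTripAsync path =
  Pipe.Collect.toMonoid
    $ Pipe.FS.read path
    >-/-> Pipe.Zlib.gzip
    >-/-> Pipe.Zlib.gunzip
    >-> Pipe.Node.unEOS
    >-> Pipe.Buffer.toString UTF8

-- Sync composition: `sync` turns each AsyncPipe into an ordinary Pipe.
roundTripSync :: String -> Aff String
roundTripSync path =
  Pipe.Collect.toMonoid
    $ Pipe.FS.read path
    >-> sync Pipe.Zlib.gzip
    >-> sync Pipe.Zlib.gunzip
    >-> Pipe.Node.unEOS
    >-> Pipe.Buffer.toString UTF8
```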
4095	test/Test/first_names.txt (Normal file; diff suppressed because it is too large)
4096	test/Test/last_names.txt (Normal file; diff suppressed because it is too large)