Project start

21 backend/node_modules/msgpackr/LICENSE (generated, vendored, new file)
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2020 Kris Zyp

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
372 backend/node_modules/msgpackr/README.md (generated, vendored, new file)
@@ -0,0 +1,372 @@
# msgpackr
[](https://www.npmjs.org/package/msgpackr)
[](benchmark.md)
[](README.md)
[](LICENSE)

<img align="right" src="./assets/performance.png" width="380"/>

The msgpackr package is an extremely fast MessagePack NodeJS/JavaScript implementation. Currently, it is significantly faster than any other known implementations, faster than Avro (for JS), and generally faster than native V8 JSON.stringify/parse, on NodeJS. It also includes an optional record extension (the `r` in msgpackr), for defining record structures that makes MessagePack even faster and more compact, often over twice as fast as even native JSON functions, several times faster than other JS implementations, and 15-50% more compact. See the performance section for more details. Structured cloning (with support for cyclical references) is also supported through optional extensions.

## Basic Usage

Install with:

```
npm i msgpackr
```

And `import` or `require` it for basic standard serialization/encoding (`pack`) and deserialization/decoding (`unpack`) functions:
```js
import { unpack, pack } from 'msgpackr';
let serializedAsBuffer = pack(value);
let data = unpack(serializedAsBuffer);
```
This `pack` function will generate standard MessagePack without any extensions that should be compatible with any standard MessagePack parser/decoder. It will serialize JavaScript objects as MessagePack `map`s by default. The `unpack` function will deserialize MessagePack `map`s as an `Object` with the properties from the map.

## Node Usage
The msgpackr package runs on any modern JS platform, but is optimized for NodeJS usage (and will use a node addon for a performance boost as an optional dependency).

### Streams
We can use the included streaming functionality (which further improves performance). The `PackrStream` is a NodeJS transform stream that can be used to serialize objects to a binary stream (writing to network/socket, IPC, etc.), and the `UnpackrStream` can be used to deserialize objects from a binary stream (reading from network/socket, etc.):

```js
import { PackrStream } from 'msgpackr';
let stream = new PackrStream();
stream.write(myData);
```
Or for a full example of sending and receiving data on a stream:
```js
import { PackrStream, UnpackrStream } from 'msgpackr';
let sendingStream = new PackrStream();
let receivingStream = new UnpackrStream();
// we are just piping to our own stream, but normally you would send and
// receive over some type of inter-process or network connection.
sendingStream.pipe(receivingStream);
sendingStream.write(myData);
receivingStream.on('data', (data) => {
  // received data
});
```
The `PackrStream` and `UnpackrStream` instances will also have the record structure extension enabled by default (see below).

## Deno and Bun Usage
Msgpackr modules are standard ESM modules and can be loaded directly from the [deno.land registry for msgpackr](https://deno.land/x/msgpackr) for use in Deno, or using the NPM module loader with `import { unpack } from 'npm:msgpackr'`. The standard pack/encode and unpack/decode functionality is available on Deno, like other platforms. msgpackr can be used like any other package on Bun.

## Browser Usage
Msgpackr works as standalone JavaScript as well, and runs on modern browsers. It includes a bundled script, at `dist/index.js`, for ease of direct loading:
```html
<script src="node_modules/msgpackr/dist/index.js"></script>
```

This is UMD based, and will register as a module if possible, or create a `msgpackr` global with all the exported functions.

For module-based development, it is recommended that you directly import the module of interest, to minimize the dependencies that get pulled into your application:
```js
import { unpack } from 'msgpackr/unpack' // if you only need to unpack
```

The package also includes a minified bundle in index.min.js.
Additionally, the package includes a version that excludes dynamic code evaluation, called index-no-eval.js, for situations where Content Security Policy (CSP) forbids eval/Function in code. The dynamic evaluation provides important performance optimizations (for records), so the no-eval build is not recommended unless required by CSP policy.

## Structured Cloning
You can also use msgpackr for [structured cloning](https://html.spec.whatwg.org/multipage/structured-data.html). By enabling the `structuredClone` option, you can include references to other objects or cyclic references, and object identity will be preserved. Structured cloning also enables preserving certain typed objects like `Error`, `Set`, `RegExp` and TypedArray instances. For example:
```js
import { Packr } from 'msgpackr';
let obj = {
  set: new Set(['a', 'b']),
  regular: /a\spattern/
};
obj.self = obj;
let packr = new Packr({ structuredClone: true });
let serialized = packr.pack(obj);
let copy = packr.unpack(serialized);
copy.self === copy // true
copy.set.has('a') // true
```

This option is disabled by default because it uses extensions, and the reference checking degrades performance (by about 25-30%). (Note this implementation doesn't serialize every class/type specified in the HTML specification, since not all of them make sense for storing across platforms.)

### Alternate Terminology
If you prefer to use encoder/decoder terminology, msgpackr exports aliases, so `decode` is equivalent to `unpack`, `encode` is `pack`, `Encoder` is `Packr`, `Decoder` is `Unpackr`, and `EncoderStream` and `DecoderStream` can be used as well.

## Record / Object Structures
There is a critical difference between maps (or dictionaries) that hold an arbitrary set of keys and values (JavaScript `Map` is designed for these), and records or object structures that have a well-defined set of fields. Typical JS objects/records may have many instances that (re)use the same structure. By using the record extension, this distinction is preserved in MessagePack, and the encoding can reuse structures; this not only provides better type preservation, but yields much more compact encodings and increases decoding performance by 2-3x. Msgpackr automatically generates record definitions that are reused and referenced by objects with the same structure. There are a number of ways to use this to our advantage. For large object structures with repeating nested objects with similar structures, simply serializing with the record extension can yield significant benefits. To use the record structures extension, we create a new `Packr` instance. By default a new `Packr` instance will have the record extension enabled:
```js
import { Packr } from 'msgpackr';
let packr = new Packr();
packr.pack(bigDataWithLotsOfObjects);
```

Another way to further leverage the benefits of the msgpackr record structures is to use streams, which naturally allow data to be reused based on previous record structures. The stream classes have the record structure extension enabled by default and provide excellent out-of-the-box performance.

When creating a new `Packr`, `Unpackr`, `PackrStream`, or `UnpackrStream` instance, we can enable or disable the record structure extension with the `useRecords` property. When this is `false`, the record structure extension will be disabled (standard/compatibility mode), all objects will revert to being serialized using MessagePack `map`s, and all `map`s will be deserialized to JS `Object`s with properties (like the standalone `pack` and `unpack` functions).

Streaming with record structures works by encoding a structure the first time it is seen in a stream and referencing the structure in later messages that are sent across that stream. When an encoder can expect a decoder to understand previous structure references, this can be configured using the `sequential: true` flag, which is auto-enabled by streams, but can also be used with Packr instances.

### Shared Record Structures
Another useful way of using msgpackr, and the record extension, is for storing data in databases, files, or other storage systems. If a number of objects with common data structures are being stored, a shared structure can be used to greatly improve data storage and deserialization efficiency. In the simplest form, provide a `structures` array, which is updated if any new object structure is encountered:
```js
import { Packr } from 'msgpackr';
let packr = new Packr({
  structures: [... structures that were last generated ...]
});
```
If you are working with persisted data, you will need to persist the `structures` data when it is updated. Msgpackr provides an API for loading and saving the `structures` on demand (which is robust and can be used in multiple-process situations where other processes may be updating this same `structures` array); we just need to provide a way to store the generated shared structure so it is available to deserialize stored data in the future:
```js
import { readFileSync, writeFileSync } from 'fs';
import { Packr, pack, unpack } from 'msgpackr';
let packr = new Packr({
  getStructures() {
    // storing our data in a file (but we could also store in a db or key-value store)
    return unpack(readFileSync('my-shared-structures.mp')) || [];
  },
  saveStructures(structures) {
    writeFileSync('my-shared-structures.mp', pack(structures));
  }
});
```
Msgpackr will automatically add and save structures as it encounters any new object structures (up to a limit of 32, by default). It will always add structures in an incremental/compatible way: any object encoded with an earlier structure can be decoded with a later version (as long as it is persisted).

#### Shared Structures Options
By default there is a limit of 32 shared structures. This default is designed to record common shared structures, but also to be resilient against sharing too many structures if there are many objects with dynamic properties that are likely to be repeated. This also allows for slightly more efficient one-byte encoding. However, if your application has more structures that are commonly repeated, you can increase this limit by setting `maxSharedStructures` to a higher value. The maximum supported number of shared structures is 8160.

You can also provide a `shouldShareStructure` function in the options if you want to specifically indicate which structures should be shared. This is called during the encoding process with the array of keys for a structure that is being considered for addition to the shared structures. For example, you might want:
```js
maxSharedStructures: 100,
shouldShareStructure(keys) {
  return !(keys[0] > 1) // don't share structures that consist of numbers as keys
}
```
### Reading Multiple Values
If you have a buffer with multiple values sequentially encoded, you can choose to parse and read the multiple values. This can be done using the `unpackMultiple` function/method, which can return an array of all the values it can sequentially parse within the provided buffer. For example:
```js
import { unpackMultiple } from 'msgpackr';
let data = new Uint8Array([1, 2, 3]) // encodings of values 1, 2, and 3
let values = unpackMultiple(data) // [1, 2, 3]
```
Alternately, you can provide a callback function that is called as the parsing occurs with each value, and can optionally terminate the parsing by returning `false`:
```js
import { unpackMultiple } from 'msgpackr';
let data = new Uint8Array([1, 2, 3]) // encodings of values 1, 2, and 3
unpackMultiple(data, (value) => {
  // called for each value
  // return false if you wish to end the parsing
})
```

If you need to know the start and end offsets of the unpacked values, these are provided as optional parameters in the callback:
```js
import { unpackMultiple } from 'msgpackr';
let data = new Uint8Array([1, 2, 3]) // encodings of values 1, 2, and 3
unpackMultiple(data, (value, start, end) => {
  // called for each value
  // `start` is the data buffer offset where the value was read from
  // `end` is `start` plus the byte length of the encoded value
  // return false if you wish to end the parsing
})
```

## Options
The following options properties can be provided to the Packr or Unpackr constructor:

* `useRecords` - Setting this to `false` disables the record extension and stores JavaScript objects as MessagePack maps, and unpacks maps as JavaScript `Object`s, which ensures compatibility with other decoders. Setting this to a function will use records for objects where `useRecords(object)` returns `true`.
* `structures` - Provides the array of structures that is to be used for the record extension, if you want the structures saved and used again. This array will be modified in place with new record structures that are serialized (if fewer than 32 structures are in the array).
* `moreTypes` - Enable serialization of additional built-in types/classes including typed arrays, `Set`s, `Map`s, and `Error`s.
* `structuredClone` - This enables the structured cloning extensions that will encode object/cyclic references. `moreTypes` is enabled by default when this is enabled.
* `mapsAsObjects` - If `true`, this will decode MessagePack maps as JavaScript `Object`s with the map entries decoded to object properties. If `false`, maps are decoded as JavaScript `Map`s. This is disabled by default if `useRecords` is enabled (which allows `Map`s to be preserved), and is enabled by default if `useRecords` is disabled.
* `useFloat32` - This will enable msgpackr to encode non-integer numbers as `float32`. See the next section for possible values.
* `variableMapSize` - This will use a varying map size definition (fixmap, map16, map32) based on the number of keys when encoding objects, which yields slightly more compact encodings (for small objects), but is typically 5-10% slower during encoding. This is necessary if you need to use objects with more than 65535 keys. This is only relevant when the record extension is disabled.
* `bundleStrings` - If `true`, this uses a custom extension that bundles strings together, so that they can be decoded more quickly on browsers and Deno, which do not have access to the NodeJS addon. This is a custom extension, so both encoder and decoder need to support it. This can yield significant decoding performance increases on browsers (30%-50%).
* `copyBuffers` - When decoding MessagePack with binary data (Buffers are encoded as binary data), copy the buffer rather than providing a slice/view of the buffer. If you want your input data to be collected or modified while the decoded embedded buffer continues to live on, you can use this option (there is extra overhead to copying).
* `useTimestamp32` - Encode JS `Date`s in 32-bit format when possible by dropping the milliseconds. This is a more efficient encoding of dates. You can also cause dates to use 32-bit format by manually setting the milliseconds to zero (`date.setMilliseconds(0)`).
* `sequential` - Encode structures in serialized data, and reference previously encoded structures, with the expectation that the decoder will read the encoded structures in the same order as encoded, with `unpackMultiple`.
* `largeBigIntToFloat` - If a bigint needs to be encoded that is larger than will fit in 64-bit integers, it will be encoded as a float-64 (otherwise a RangeError is thrown).
* `largeBigIntToString` - If a bigint needs to be encoded that is larger than will fit in 64-bit integers, it will be encoded as a string (otherwise a RangeError is thrown).
* `useBigIntExtension` - If a bigint needs to be encoded that is larger than will fit in 64-bit integers, it will be encoded using a custom extension that supports up to about 1000 bits of integer precision.
* `encodeUndefinedAsNil` - Encodes a value of `undefined` as a MessagePack `nil`, the same as a `null`.
* `int64AsType` - This will decode uint64 and int64 numbers as the specified type. The type can be `bigint` (default), `number`, `string`, or `auto` (where the range [-2^53...2^53] is represented by a number and everything else by a bigint).
* `skipValues` - This can be an array of property values that indicate properties that should be skipped when serializing objects. For example, to mimic `JSON.stringify`'s behavior of skipping properties with a value of `undefined`, you can provide `skipValues: [undefined]`. Note that this only applies to serializing objects as standard MessagePack maps, not to records. Also, the array is checked by calling its `includes` method, so you can provide an object with an `includes` method if you want a custom function to skip values.
* `onInvalidDate` - This can be provided as a function that will be called when an invalid date is provided. The function can throw an error, or return a value that will be encoded in place of the invalid date. If not provided, an invalid date will be encoded as an invalid timestamp (which decodes with msgpackr back to an invalid date).
* `writeFunction` - This can be provided as a function that will be called when a function is encountered. The function can throw an error, or return a value that will be encoded in place of the function. If not provided, a function will be encoded as undefined (similar to `JSON.stringify`).
* `mapAsEmptyObject` - Encodes JS `Map`s as empty objects (for back-compat with older libraries).
* `setAsEmptyObject` - Encodes JS `Set`s as empty objects (for back-compat with older libraries).
* `allowArraysInMapKeys` - Allows arrays to be used as keys in Maps, as long as all elements are strings, numbers, booleans, or bigints. When enabled, such arrays are flattened and converted to a string representation.

### 32-bit Float Options
By default all non-integer numbers are serialized as 64-bit floats (doubles). This is fast, and ensures maximum precision. However, often real-world data doesn't need 64 bits of precision, and using 32-bit encoding can be much more space efficient. There are several options that provide more efficient encodings. Using the decimal rounding options for encoding and decoding provides lossless storage of common decimal representations like 7.99, in the more efficient 32-bit format (rather than 64-bit). The `useFloat32` property has several possible options, available from the module as constants:
```js
import { FLOAT32_OPTIONS } from 'msgpackr';
const { ALWAYS, DECIMAL_ROUND, DECIMAL_FIT } = FLOAT32_OPTIONS;
```

* `ALWAYS` (1) - Always encode non-integers (with absolute value less than 2147483648) as 32-bit floats.
* `DECIMAL_ROUND` (3) - Always encode non-integers as 32-bit floats, and when decoding 32-bit floats, round to the significant decimal digits (usually 7, but 6 or 8 digits for some ranges).
* `DECIMAL_FIT` (4) - Only encode non-integers as 32-bit floats if all significant digits (usually up to 7) can be unambiguously encoded as a 32-bit float, and decode/unpack with decimal rounding (same as above). This will ensure round-trip encoding/decoding without loss in precision and uses 32-bit encoding when possible.

Note that performance is decreased with decimal rounding by about 20-25%, although if only 5% of your values are floating point, that will only have about a 1% impact overall.

In addition, msgpackr exports a `roundFloat32(number)` function that can be used to round floating point numbers to the maximum significant decimal digits that can be stored in a 32-bit float, just as DECIMAL_ROUND does when decoding. This can be useful for determining how a number will be decoded prior to encoding it.

## Performance
### Native Acceleration
Msgpackr employs an optional native node-addon to accelerate the parsing of strings. This should be automatically installed and utilized on NodeJS. However, you can verify this by checking the `isNativeAccelerationEnabled` property that is exported from msgpackr. If this is `false`, the `msgpackr-extract` package may not have been properly installed, and you may want to verify that it is installed correctly:
```js
import { isNativeAccelerationEnabled } from 'msgpackr'
if (!isNativeAccelerationEnabled)
	console.warn('Native acceleration not enabled, verify that install finished properly')
```

### Benchmarks
Msgpackr is fast. Really fast. Here is a comparison with the next fastest JS projects using the benchmark tool from `msgpack-lite` (and the sample data is from some clinical research data we use that has a good mix of different value types and structures). It also includes a comparison to V8's native JSON functionality, and JavaScript Avro (`avsc`, a very optimized Avro implementation):

operation | op | ms | op/s
---------------------------------------------------------- | ------: | ----: | -----:
buf = Buffer(JSON.stringify(obj)); | 81600 | 5002 | 16313
obj = JSON.parse(buf); | 90700 | 5004 | 18125
require("msgpackr").pack(obj); | 169700 | 5000 | 33940
require("msgpackr").unpack(buf); | 109700 | 5003 | 21926
msgpackr w/ shared structures: packr.pack(obj); | 190400 | 5001 | 38072
msgpackr w/ shared structures: packr.unpack(buf); | 422900 | 5000 | 84580
buf = require("msgpack-lite").encode(obj); | 31300 | 5005 | 6253
obj = require("msgpack-lite").decode(buf); | 15700 | 5007 | 3135
buf = require("@msgpack/msgpack").encode(obj); | 103100 | 5003 | 20607
obj = require("@msgpack/msgpack").decode(buf); | 59100 | 5004 | 11810
buf = require("notepack").encode(obj); | 65500 | 5007 | 13081
obj = require("notepack").decode(buf); | 33400 | 5009 | 6667
obj = require("msgpack-unpack").decode(buf); | 6900 | 5036 | 1370
require("avsc")...make schema/type...type.toBuffer(obj); | 89300 | 5005 | 17842
require("avsc")...make schema/type...type.fromBuffer(obj); | 108400 | 5001 | 21675

All benchmarks were performed on Node 15 / V8 8.6 (Windows i7-4770 3.4Ghz).
(`avsc` is schema-based and more comparable in style to msgpackr with shared structures.)

Here is a benchmark of streaming data (again borrowed from `msgpack-lite`'s benchmarking), where msgpackr is able to take advantage of the structured record extension and really demonstrate its performance capabilities:

operation (1000000 x 2) | op | ms | op/s
------------------------------------------------ | ------: | ----: | -----:
new PackrStream().write(obj); | 1000000 | 372 | 2688172
new UnpackrStream().write(buf); | 1000000 | 247 | 4048582
stream.write(msgpack.encode(obj)); | 1000000 | 2898 | 345065
stream.write(msgpack.decode(buf)); | 1000000 | 1969 | 507872
stream.write(notepack.encode(obj)); | 1000000 | 901 | 1109877
stream.write(notepack.decode(buf)); | 1000000 | 1012 | 988142
msgpack.Encoder().on("data",ondata).encode(obj); | 1000000 | 1763 | 567214
msgpack.createDecodeStream().write(buf); | 1000000 | 2222 | 450045
msgpack.createEncodeStream().write(obj); | 1000000 | 1577 | 634115
msgpack.Decoder().on("data",ondata).decode(buf); | 1000000 | 2246 | 445235

See the [benchmark.md](benchmark.md) for more benchmarks and information about benchmarking.

## Custom Extensions
You can add your own custom extensions, which can be used to encode specific types/classes in certain ways. This is done by using the `addExtension` function, and specifying the class, extension `type` code (should be a number from 1-100, reserving negatives for MessagePack, 101-127 for msgpackr), and your `pack` and `unpack` functions (or just the one you need).
```js
import { addExtension, Packr } from 'msgpackr';

class MyCustomClass {...}

let extPackr = new Packr();
addExtension({
  Class: MyCustomClass,
  type: 11, // register your own extension code (a type code from 1-100)
  pack(instance) {
    // define how your custom class should be encoded
    return Buffer.from([instance.myData]); // return a buffer
  },
  unpack(buffer) {
    // define how your custom class should be decoded
    let instance = new MyCustomClass();
    instance.myData = buffer[0];
    return instance; // decoded value from buffer
  }
});
```
If you want to use msgpackr to encode and decode the data within your extensions, you can use the `read` and `write` functions and read and write data/objects that will be encoded and decoded by msgpackr, which can be easier and faster than creating and receiving separate buffers:

```js
import { addExtension, Packr } from 'msgpackr';

class MyCustomClass {...}

let extPackr = new Packr();
addExtension({
  Class: MyCustomClass,
  type: 11, // register your own extension code (a type code from 1-100)
  write(instance) {
    // define how your custom class should be encoded
    return instance.myData; // return some data to be encoded
  },
  read(data) {
    // define how your custom class should be decoded,
    // data will already be unpacked/decoded
    let instance = new MyCustomClass();
    instance.myData = data;
    return instance; // return decoded value
  }
});
```
Note that you can just return the same object from `write`, and in this case msgpackr will encode it using the default object/array encoding:
```js
addExtension({
  Class: MyCustomClass,
  type: 12,
  read: function(data) {
    Object.setPrototypeOf(data, MyCustomClass.prototype)
    return data
  },
  write: function(data) {
    return data
  }
})
```
You can also create an extension with `Class` and `write` methods, but no `type` (or `read`), if you just want to customize how a class is serialized without using MessagePack extension encoding.

### Additional Performance Optimizations
Msgpackr is already fast, but here are some tips for making it faster:

#### Buffer Reuse
Msgpackr is designed to work well with reusable buffers. Allocating new buffers can be relatively expensive, so if you have Node addons, it can be much faster to reuse buffers and use memcpy to copy data into existing buffers. Then msgpackr `unpack` can be executed on the same buffer, with new data, and optionally take a second parameter indicating the effective size of the available data in the buffer.

#### Arena Allocation (`useBuffer()`)
During the serialization process, data is written to buffers. Again, allocating new buffers is a relatively expensive process, and the `useBuffer` method can allow reuse of buffers that will further improve performance. With the `useBuffer` method, you can provide a buffer, serialize data into it, and when it is known that you are done using that buffer, you can call `useBuffer` again to reuse it. The use of `useBuffer` is never required; buffers will still be handled and cleaned up through GC if not used. It just provides a small performance boost.

## Record Structure Extension Definition
The record structure extension uses extension id 0x72 ("r") to declare the use of this functionality. The extension "data" byte (or bytes) identifies the byte or bytes used to identify the start of a record in the subsequent MessagePack block or stream. The identifier byte (or the first byte in a sequence) must be from 0x40 - 0x7f (and therefore replaces the one-byte representations of positive integers 64 - 127, which can alternately be represented with int or uint types). The extension declaration must be immediately followed by a MessagePack array that defines the field names of the record structure.

Once a record identifier and record field names have been defined, the parser/decoder should proceed to read the next value. Any subsequent use of the record identifier as a value in the block or stream should be parsed as a record instance, and the next n values, where n is the number of fields (as defined in the array of field names), should be read as the values of the fields. For example, here we have defined a structure with fields "foo" and "bar", with the record identifier 0x40, and then read a record instance that defines the field values of 4 and 2, respectively:
```
+--------+--------+--------+~~~~~~~~~~~~~~~~~~~~~~~~~+--------+--------+
|  0xd4  |  0x72  |  0x40  | array: [ "foo", "bar" ] |  0x04  |  0x02  |
+--------+--------+--------+~~~~~~~~~~~~~~~~~~~~~~~~~+--------+--------+
```
Which should generate an object that would correspond to the JSON:
```js
{ "foo": 4, "bar": 2 }
```

## Additional value types
msgpackr supports `undefined` (using fixext1 + type: 0 + data: 0 to match other JS implementations), `NaN`, `Infinity`, and `-Infinity` (using standard IEEE 754 representations with doubles/floats).

### Dates
msgpackr saves all JavaScript `Date`s using the standard MessagePack date extension (type -1), using the smallest of the 32-bit, 64-bit or 96-bit formats needed to store the date without data loss (or using 32-bit if the useTimestamp32 option is specified).

### Structured Cloning

With structured cloning enabled, msgpackr will also use extensions to store Set, Map, Error, RegExp, and ArrayBufferView objects and preserve their types.

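What these extensions accomplish can be sketched in plain JavaScript. This is a hypothetical illustration of the idea (tag special types on write, rebuild them on read), not msgpackr's actual wire format or extension API:

```javascript
// Hypothetical sketch of type preservation via extensions: each special
// type becomes a tagged plain value on write and is rebuilt on read.
// This is NOT msgpackr's wire format, just the principle.
function toTagged(value) {
  if (value instanceof Map) return { $type: 'Map', entries: [...value] };
  if (value instanceof Set) return { $type: 'Set', values: [...value] };
  if (value instanceof RegExp) return { $type: 'RegExp', source: value.source, flags: value.flags };
  return value;
}
function fromTagged(value) {
  if (value && value.$type === 'Map') return new Map(value.entries);
  if (value && value.$type === 'Set') return new Set(value.values);
  if (value && value.$type === 'RegExp') return new RegExp(value.source, value.flags);
  return value;
}

const restored = fromTagged(toTagged(new Set([1, 2, 3])));
console.log(restored instanceof Set, [...restored]); // true [ 1, 2, 3 ]
```
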
## Alternate Encoding/Package

The high-performance serialization and deserialization algorithms in the msgpackr package are also available in the [cbor-x](https://github.com/kriszyp/cbor-x) package for the CBOR format, with the same API and design. A quick summary of the pros and cons of using MessagePack vs. CBOR:

* MessagePack has wider adoption, and, at least with this implementation, is slightly more efficient (by roughly 1%).
* CBOR has an [official IETF standardization track](https://tools.ietf.org/html/rfc7049), and the record extension is conceptually/philosophically a better fit for CBOR tags.

## License

MIT

### Browser Consideration

MessagePack can be a great choice for high-performance data delivery to browsers, as reasonable data size is possible without compression. And msgpackr works very well in modern browsers. However, it is worth noting that if you want highly compact data, brotli or gzip are the most effective compressors, and MessagePack's character frequency tends to defeat the Huffman encoding used by these standard compression algorithms, resulting in less compact data than compressed JSON.

### Credits

Various projects have been inspirations for this, and code has been borrowed from https://github.com/msgpack/msgpack-javascript and https://github.com/mtth/avsc.
11
backend/node_modules/msgpackr/SECURITY.md
generated
vendored
Normal file
@@ -0,0 +1,11 @@
# Security Policy

## Supported Versions

| Version | Supported          |
| ------- | ------------------ |
| 1.4.x   | :white_check_mark: |

## Reporting a Vulnerability

Please report security vulnerabilities to kriszyp@gmail.com.
67
backend/node_modules/msgpackr/benchmark.md
generated
vendored
Normal file
@@ -0,0 +1,67 @@
Here are more comprehensive benchmarks. This is a comparison with the next fastest JS projects using the benchmark tool from `msgpack-lite` (the data is from some clinical research data we use that has a good mix of different value types and structures). It also includes a comparison to V8 native JSON functionality and JavaScript Avro (`avsc`, a very optimized Avro implementation):

operation                                                  |      op |    ms |   op/s
---------------------------------------------------------- | ------: | ----: | -----:
buf = Buffer(JSON.stringify(obj));                         |   82000 |  5004 |  16386
obj = JSON.parse(buf);                                     |   88600 |  5000 |  17720
require("msgpackr").pack(obj);                             |  161500 |  5002 |  32287
require("msgpackr").unpack(buf);                           |   94600 |  5004 |  18904
msgpackr w/ shared structures: packr.pack(obj);            |  178400 |  5002 |  35665
msgpackr w/ shared structures: packr.unpack(buf);          |  376700 |  5000 |  75340
buf = require("msgpack-lite").encode(obj);                 |   30100 |  5012 |   6005
obj = require("msgpack-lite").decode(buf);                 |   16200 |  5001 |   3239
buf = require("notepack").encode(obj);                     |   62600 |  5005 |  12507
obj = require("notepack").decode(buf);                     |   32400 |  5007 |   6470
require("what-the-pack")... encoder.encode(obj);           |   63500 |  5002 |  12694
require("what-the-pack")... encoder.decode(buf);           |   32000 |  5001 |   6398
require("avsc")...make schema/type...type.toBuffer(obj);   |   84600 |  5003 |  16909
require("avsc")...make schema/type...type.fromBuffer(buf); |   99300 |  5001 |  19856

(`avsc` is schema-based and more comparable in style to msgpackr with shared structures.)

Here is a benchmark of streaming data (again borrowed from `msgpack-lite`'s benchmarking), where msgpackr is able to take advantage of the structured record extension and really pull away from other tools:

operation (1000000 x 2)                          |      op |    ms |    op/s
------------------------------------------------ | ------: | ----: | ------:
new PackrStream().write(obj);                    | 1000000 |   372 | 2688172
new UnpackrStream().write(buf);                  | 1000000 |   247 | 4048582
stream.write(msgpack.encode(obj));               | 1000000 |  2898 |  345065
stream.write(msgpack.decode(buf));               | 1000000 |  1969 |  507872
stream.write(notepack.encode(obj));              | 1000000 |   901 | 1109877
stream.write(notepack.decode(buf));              | 1000000 |  1012 |  988142
msgpack.Encoder().on("data",ondata).encode(obj); | 1000000 |  1763 |  567214
msgpack.createDecodeStream().write(buf);         | 1000000 |  2222 |  450045
msgpack.createEncodeStream().write(obj);         | 1000000 |  1577 |  634115
msgpack.Decoder().on("data",ondata).decode(buf); | 1000000 |  2246 |  445235

These are the benchmarks from the notepack package. The larger test data for these benchmarks is very heavily weighted with large binary/buffer data and objects with extreme numbers of keys (much more than I typically see with real-world data, but YMMV):

node ./benchmarks/encode

library          |              tiny |           small |         medium |         large
---------------- | ----------------: | --------------: | -------------: | ------------:
notepack         | 2,171,621 ops/sec | 546,905 ops/sec | 29,578 ops/sec |   265 ops/sec
msgpack-js       |   967,682 ops/sec | 184,455 ops/sec | 20,556 ops/sec |   259 ops/sec
msgpackr         | 2,392,826 ops/sec | 556,915 ops/sec | 70,573 ops/sec |   313 ops/sec
msgpack-lite     |   553,143 ops/sec | 132,318 ops/sec | 11,816 ops/sec |   186 ops/sec
@msgpack/msgpack | 2,157,655 ops/sec | 573,236 ops/sec | 25,864 ops/sec | 90.26 ops/sec

node ./benchmarks/decode

library          |              tiny |           small |          medium |         large
---------------- | ----------------: | --------------: | --------------: | ------------:
notepack         | 2,220,904 ops/sec | 560,630 ops/sec |  28,177 ops/sec |   275 ops/sec
msgpack-js       |   965,719 ops/sec | 222,047 ops/sec |  21,431 ops/sec |   257 ops/sec
msgpackr         | 2,320,046 ops/sec | 589,167 ops/sec |  70,299 ops/sec |   329 ops/sec
msgpackr records | 3,750,547 ops/sec | 912,419 ops/sec | 136,853 ops/sec |   733 ops/sec
msgpack-lite     |   569,222 ops/sec | 129,008 ops/sec |  12,424 ops/sec |   180 ops/sec
@msgpack/msgpack | 2,089,697 ops/sec | 557,507 ops/sec |  20,256 ops/sec | 85.03 ops/sec

This was run by adding msgpackr to the benchmarks for notepack.

All benchmarks were performed on Node 14.8.0 (Windows i7-4770 3.4Ghz). They can be run with:

    npm install --no-save msgpack msgpack-js @msgpack/msgpack msgpack-lite notepack avsc
    node tests/benchmark
2403
backend/node_modules/msgpackr/dist/index-no-eval.cjs
generated
vendored
Normal file
File diff suppressed because it is too large
1
backend/node_modules/msgpackr/dist/index-no-eval.cjs.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
2
backend/node_modules/msgpackr/dist/index-no-eval.min.js
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
1
backend/node_modules/msgpackr/dist/index-no-eval.min.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
2402
backend/node_modules/msgpackr/dist/index.js
generated
vendored
Normal file
File diff suppressed because it is too large
1
backend/node_modules/msgpackr/dist/index.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
2
backend/node_modules/msgpackr/dist/index.min.js
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
1
backend/node_modules/msgpackr/dist/index.min.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
3313
backend/node_modules/msgpackr/dist/node.cjs
generated
vendored
Normal file
File diff suppressed because it is too large
1
backend/node_modules/msgpackr/dist/node.cjs.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
4536
backend/node_modules/msgpackr/dist/test.js
generated
vendored
Normal file
File diff suppressed because it is too large
1
backend/node_modules/msgpackr/dist/test.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
1250
backend/node_modules/msgpackr/dist/unpack-no-eval.cjs
generated
vendored
Normal file
File diff suppressed because it is too large
1
backend/node_modules/msgpackr/dist/unpack-no-eval.cjs.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
90
backend/node_modules/msgpackr/index.d.cts
generated
vendored
Normal file
@@ -0,0 +1,90 @@
export enum FLOAT32_OPTIONS {
	NEVER = 0,
	ALWAYS = 1,
	DECIMAL_ROUND = 3,
	DECIMAL_FIT = 4
}

export interface Options {
	useFloat32?: FLOAT32_OPTIONS
	useRecords?: boolean | ((value: any) => boolean)
	structures?: {}[]
	moreTypes?: boolean
	sequential?: boolean
	structuredClone?: boolean
	mapsAsObjects?: boolean
	variableMapSize?: boolean
	coercibleKeyAsNumber?: boolean
	copyBuffers?: boolean
	bundleStrings?: boolean
	useTimestamp32?: boolean
	largeBigIntToFloat?: boolean
	largeBigIntToString?: boolean
	useBigIntExtension?: boolean
	encodeUndefinedAsNil?: boolean
	maxSharedStructures?: number
	maxOwnStructures?: number
	mapAsEmptyObject?: boolean
	setAsEmptyObject?: boolean
	allowArraysInMapKeys?: boolean
	writeFunction?: () => any
	/** @deprecated use int64AsType: 'number' */
	int64AsNumber?: boolean
	int64AsType?: 'bigint' | 'number' | 'string'
	shouldShareStructure?: (keys: string[]) => boolean
	getStructures?(): {}[]
	saveStructures?(structures: {}[]): boolean | void
	onInvalidDate?: () => any
}
interface Extension {
	Class?: Function
	type?: number
	pack?(value: any): Buffer | Uint8Array
	unpack?(messagePack: Buffer | Uint8Array): any
	read?(datum: any): any
	write?(instance: any): any
}
export type UnpackOptions = { start?: number; end?: number; lazy?: boolean; } | number;
export class Unpackr {
	constructor(options?: Options)
	unpack(messagePack: Buffer | Uint8Array, options?: UnpackOptions): any
	decode(messagePack: Buffer | Uint8Array, options?: UnpackOptions): any
	unpackMultiple(messagePack: Buffer | Uint8Array): any[]
	unpackMultiple(messagePack: Buffer | Uint8Array, forEach: (value: any, start?: number, end?: number) => any): void
}
export class Decoder extends Unpackr {}
export function unpack(messagePack: Buffer | Uint8Array, options?: UnpackOptions): any
export function unpackMultiple(messagePack: Buffer | Uint8Array): any[]
export function unpackMultiple(messagePack: Buffer | Uint8Array, forEach: (value: any, start?: number, end?: number) => any): void
export function decode(messagePack: Buffer | Uint8Array, options?: UnpackOptions): any
export function addExtension(extension: Extension): void
export function clearSource(): void
export function roundFloat32(float32Number: number): number
export const C1: {}
export let isNativeAccelerationEnabled: boolean

export class Packr extends Unpackr {
	offset: number;
	position: number;
	pack(value: any, encodeOptions?: number): Buffer
	encode(value: any, encodeOptions?: number): Buffer
	useBuffer(buffer: Buffer | Uint8Array): void;
	clearSharedData(): void;
}
export class Encoder extends Packr {}
export function pack(value: any, encodeOptions?: number): Buffer
export function encode(value: any, encodeOptions?: number): Buffer

export const REUSE_BUFFER_MODE: number;
export const RESET_BUFFER_MODE: number;
export const RESERVE_START_SPACE: number;

import { Transform, Readable } from 'stream'

export as namespace msgpackr;
export class UnpackrStream extends Transform {
	constructor(options?: Options | { highWaterMark: number, emitClose: boolean, allowHalfOpen: boolean })
}
export class PackrStream extends Transform {
	constructor(options?: Options | { highWaterMark: number, emitClose: boolean, allowHalfOpen: boolean })
}
90
backend/node_modules/msgpackr/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,90 @@
export enum FLOAT32_OPTIONS {
	NEVER = 0,
	ALWAYS = 1,
	DECIMAL_ROUND = 3,
	DECIMAL_FIT = 4
}

export interface Options {
	useFloat32?: FLOAT32_OPTIONS
	useRecords?: boolean | ((value: any) => boolean)
	structures?: {}[]
	moreTypes?: boolean
	sequential?: boolean
	structuredClone?: boolean
	mapsAsObjects?: boolean
	variableMapSize?: boolean
	coercibleKeyAsNumber?: boolean
	copyBuffers?: boolean
	bundleStrings?: boolean
	useTimestamp32?: boolean
	largeBigIntToFloat?: boolean
	largeBigIntToString?: boolean
	useBigIntExtension?: boolean
	encodeUndefinedAsNil?: boolean
	maxSharedStructures?: number
	maxOwnStructures?: number
	mapAsEmptyObject?: boolean
	setAsEmptyObject?: boolean
	allowArraysInMapKeys?: boolean
	writeFunction?: () => any
	/** @deprecated use int64AsType: 'number' */
	int64AsNumber?: boolean
	int64AsType?: 'bigint' | 'number' | 'string'
	shouldShareStructure?: (keys: string[]) => boolean
	getStructures?(): {}[]
	saveStructures?(structures: {}[]): boolean | void
	onInvalidDate?: () => any
}
interface Extension {
	Class?: Function
	type?: number
	pack?(value: any): Buffer | Uint8Array
	unpack?(messagePack: Buffer | Uint8Array): any
	read?(datum: any): any
	write?(instance: any): any
}
export type UnpackOptions = { start?: number; end?: number; lazy?: boolean; } | number;
export class Unpackr {
	constructor(options?: Options)
	unpack(messagePack: Buffer | Uint8Array, options?: UnpackOptions): any
	decode(messagePack: Buffer | Uint8Array, options?: UnpackOptions): any
	unpackMultiple(messagePack: Buffer | Uint8Array): any[]
	unpackMultiple(messagePack: Buffer | Uint8Array, forEach: (value: any, start?: number, end?: number) => any): void
}
export class Decoder extends Unpackr {}
export function unpack(messagePack: Buffer | Uint8Array, options?: UnpackOptions): any
export function unpackMultiple(messagePack: Buffer | Uint8Array): any[]
export function unpackMultiple(messagePack: Buffer | Uint8Array, forEach: (value: any, start?: number, end?: number) => any): void
export function decode(messagePack: Buffer | Uint8Array, options?: UnpackOptions): any
export function addExtension(extension: Extension): void
export function clearSource(): void
export function roundFloat32(float32Number: number): number
export const C1: {}
export let isNativeAccelerationEnabled: boolean

export class Packr extends Unpackr {
	offset: number;
	position: number;
	pack(value: any, encodeOptions?: number): Buffer
	encode(value: any, encodeOptions?: number): Buffer
	useBuffer(buffer: Buffer | Uint8Array): void;
	clearSharedData(): void;
}
export class Encoder extends Packr {}
export function pack(value: any, encodeOptions?: number): Buffer
export function encode(value: any, encodeOptions?: number): Buffer

export const REUSE_BUFFER_MODE: number;
export const RESET_BUFFER_MODE: number;
export const RESERVE_START_SPACE: number;

import { Transform, Readable } from 'stream'

export as namespace msgpackr;
export class UnpackrStream extends Transform {
	constructor(options?: Options | { highWaterMark: number, emitClose: boolean, allowHalfOpen: boolean })
}
export class PackrStream extends Transform {
	constructor(options?: Options | { highWaterMark: number, emitClose: boolean, allowHalfOpen: boolean })
}
5
backend/node_modules/msgpackr/index.js
generated
vendored
Normal file
@@ -0,0 +1,5 @@
export { Packr, Encoder, addExtension, pack, encode, NEVER, ALWAYS, DECIMAL_ROUND, DECIMAL_FIT, REUSE_BUFFER_MODE, RESET_BUFFER_MODE, RESERVE_START_SPACE } from './pack.js'
export { Unpackr, Decoder, C1, unpack, unpackMultiple, decode, FLOAT32_OPTIONS, clearSource, roundFloat32, isNativeAccelerationEnabled } from './unpack.js'
export { decodeIter, encodeIter } from './iterators.js'
export const useRecords = false
export const mapsAsObjects = true
87
backend/node_modules/msgpackr/iterators.js
generated
vendored
Normal file
@@ -0,0 +1,87 @@
import { Packr } from './pack.js'
import { Unpackr } from './unpack.js'

/**
 * Given an Iterable first argument, returns an Iterable where each value is packed as a Buffer.
 * If the argument is only Async Iterable, the return value will be an Async Iterable.
 * @param {Iterable|Iterator|AsyncIterable|AsyncIterator} objectIterator - iterable source, like a Readable object stream, an array, Set, or custom object
 * @param {object} [options] - msgpackr pack options
 * @returns {IterableIterator|Promise.<AsyncIterableIterator>}
 */
export function packIter (objectIterator, options = {}) {
  if (!objectIterator || typeof objectIterator !== 'object') {
    throw new Error('first argument must be an Iterable, Async Iterable, or a Promise for an Async Iterable')
  } else if (typeof objectIterator[Symbol.iterator] === 'function') {
    return packIterSync(objectIterator, options)
  } else if (typeof objectIterator.then === 'function' || typeof objectIterator[Symbol.asyncIterator] === 'function') {
    return packIterAsync(objectIterator, options)
  } else {
    throw new Error('first argument must be an Iterable, Async Iterable, Iterator, Async Iterator, or a Promise')
  }
}

function * packIterSync (objectIterator, options) {
  const packr = new Packr(options)
  for (const value of objectIterator) {
    yield packr.pack(value)
  }
}

async function * packIterAsync (objectIterator, options) {
  const packr = new Packr(options)
  for await (const value of objectIterator) {
    yield packr.pack(value)
  }
}

/**
 * Given an Iterable/Iterator input which yields buffers, returns an IterableIterator which yields sync decoded objects.
 * Or, given an Async Iterable/Iterator which yields promises resolving in buffers, returns an AsyncIterableIterator.
 * @param {Iterable|Iterator|AsyncIterable|AsyncIterableIterator} bufferIterator
 * @param {object} [options] - unpackr options
 * @returns {IterableIterator|Promise.<AsyncIterableIterator>}
 */
export function unpackIter (bufferIterator, options = {}) {
  if (!bufferIterator || typeof bufferIterator !== 'object') {
    throw new Error('first argument must be an Iterable, Async Iterable, Iterator, Async Iterator, or a promise')
  }

  const unpackr = new Unpackr(options)
  let incomplete
  const parser = (chunk) => {
    let yields
    // if there's incomplete data from the previous chunk, concatenate and try again
    if (incomplete) {
      chunk = Buffer.concat([incomplete, chunk])
      incomplete = undefined
    }

    try {
      yields = unpackr.unpackMultiple(chunk)
    } catch (err) {
      if (err.incomplete) {
        incomplete = chunk.slice(err.lastPosition)
        yields = err.values
      } else {
        throw err
      }
    }
    return yields
  }

  if (typeof bufferIterator[Symbol.iterator] === 'function') {
    return (function * iter () {
      for (const value of bufferIterator) {
        yield * parser(value)
      }
    })()
  } else if (typeof bufferIterator[Symbol.asyncIterator] === 'function') {
    return (async function * iter () {
      for await (const value of bufferIterator) {
        yield * parser(value)
      }
    })()
  }
}
export const decodeIter = unpackIter
export const encodeIter = packIter
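The incomplete-chunk handling in `unpackIter` (buffer the tail that failed to parse, prepend it to the next chunk) can be illustrated independently of MessagePack. This hypothetical sketch applies the same pattern to newline-delimited JSON:

```javascript
// Hypothetical illustration of unpackIter's incomplete-chunk pattern,
// using newline-delimited JSON instead of MessagePack: any trailing
// partial record is buffered and prepended to the next chunk.
function * parseNdjsonChunks (chunks) {
  let incomplete = ''
  for (const chunk of chunks) {
    const data = incomplete + chunk
    const lines = data.split('\n')
    incomplete = lines.pop() // tail with no newline yet: keep for next chunk
    for (const line of lines) {
      if (line) yield JSON.parse(line)
    }
  }
  if (incomplete) yield JSON.parse(incomplete) // flush any final record
}

const chunks = ['{"a":1}\n{"b"', ':2}\n{"c":3}']
console.log([...parseNdjsonChunks(chunks)])
// [ { a: 1 }, { b: 2 }, { c: 3 } ]
```
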
25
backend/node_modules/msgpackr/node-index.js
generated
vendored
Normal file
@@ -0,0 +1,25 @@
export { Packr, Encoder, addExtension, pack, encode, NEVER, ALWAYS, DECIMAL_ROUND, DECIMAL_FIT } from './pack.js'
export { Unpackr, Decoder, C1, unpack, unpackMultiple, decode, FLOAT32_OPTIONS, clearSource, roundFloat32, isNativeAccelerationEnabled } from './unpack.js'
import './struct.js'
export { PackrStream, UnpackrStream, PackrStream as EncoderStream, UnpackrStream as DecoderStream } from './stream.js'
export { decodeIter, encodeIter } from './iterators.js'
export const useRecords = false
export const mapsAsObjects = true
import { setExtractor } from './unpack.js'
import { createRequire } from 'module'

const nativeAccelerationDisabled = process.env.MSGPACKR_NATIVE_ACCELERATION_DISABLED !== undefined && process.env.MSGPACKR_NATIVE_ACCELERATION_DISABLED.toLowerCase() === 'true';

if (!nativeAccelerationDisabled) {
	let extractor
	try {
		if (typeof require == 'function')
			extractor = require('msgpackr-extract')
		else
			extractor = createRequire(import.meta.url)('msgpackr-extract')
		if (extractor)
			setExtractor(extractor.extractStrings)
	} catch (error) {
		// the native module is optional
	}
}
1
backend/node_modules/msgpackr/pack.d.cts
generated
vendored
Normal file
@@ -0,0 +1 @@
export { Unpackr, Decoder, Packr, Encoder, pack, encode, unpack, decode, addExtension, FLOAT32_OPTIONS } from '.'
1
backend/node_modules/msgpackr/pack.d.ts
generated
vendored
Normal file
@@ -0,0 +1 @@
export { Unpackr, Decoder, Packr, Encoder, pack, encode, unpack, decode, addExtension, FLOAT32_OPTIONS } from '.'
1137
backend/node_modules/msgpackr/pack.js
generated
vendored
Normal file
File diff suppressed because it is too large
104
backend/node_modules/msgpackr/package.json
generated
vendored
Normal file
@@ -0,0 +1,104 @@
{
  "name": "msgpackr",
  "author": "Kris Zyp",
  "version": "1.11.5",
  "description": "Ultra-fast MessagePack implementation with extensions for records and structured cloning",
  "license": "MIT",
  "types": "./index.d.ts",
  "main": "./dist/node.cjs",
  "module": "./index.js",
  "react-native": "./index.js",
  "keywords": [
    "MessagePack",
    "msgpack",
    "performance",
    "structured",
    "clone"
  ],
  "repository": {
    "type": "git",
    "url": "http://github.com/kriszyp/msgpackr"
  },
  "scripts": {
    "benchmark": "node ./tests/benchmark.cjs",
    "build": "rollup -c && cpy index.d.ts . --rename=index.d.cts && cpy pack.d.ts . --rename=pack.d.cts && cpy unpack.d.ts . --rename=unpack.d.cts",
    "dry-run": "npm publish --dry-run",
    "prepare": "npm run build",
    "test": "mocha tests/test**.*js -u tdd --experimental-json-modules"
  },
  "type": "module",
  "exports": {
    ".": {
      "types": {
        "require": "./index.d.cts",
        "import": "./index.d.ts"
      },
      "browser": "./index.js",
      "node": {
        "require": "./dist/node.cjs",
        "import": "./node-index.js"
      },
      "bun": {
        "require": "./dist/node.cjs",
        "import": "./node-index.js"
      },
      "default": "./index.js"
    },
    "./pack": {
      "types": {
        "require": "./pack.d.cts",
        "import": "./pack.d.ts"
      },
      "browser": "./pack.js",
      "node": {
        "import": "./index.js",
        "require": "./dist/node.cjs"
      },
      "bun": {
        "import": "./index.js",
        "require": "./dist/node.cjs"
      },
      "default": "./pack.js"
    },
    "./unpack": {
      "types": {
        "require": "./unpack.d.cts",
        "import": "./unpack.d.ts"
      },
      "browser": "./unpack.js",
      "node": {
        "import": "./index.js",
        "require": "./dist/node.cjs"
      },
      "bun": {
        "import": "./index.js",
        "require": "./dist/node.cjs"
      },
      "default": "./unpack.js"
    },
    "./unpack-no-eval": "./dist/unpack-no-eval.cjs",
    "./index-no-eval": "./dist/index-no-eval.cjs"
  },
  "files": [
    "/dist",
    "*.md",
    "/*.js",
    "/*.ts",
    "/*.cts"
  ],
  "optionalDependencies": {
    "msgpackr-extract": "^3.0.2"
  },
  "devDependencies": {
    "@rollup/plugin-json": "^5.0.1",
    "@rollup/plugin-replace": "^5.0.1",
    "@types/node": "latest",
    "async": "^3",
    "chai": "^4.3.4",
    "cpy-cli": "^4.1.0",
    "esm": "^3.2.25",
    "mocha": "^10.1.0",
    "rollup": "^3.2.5",
    "@rollup/plugin-terser": "^0.1.0"
  }
}
88
backend/node_modules/msgpackr/rollup.config.js
generated
vendored
Normal file
@@ -0,0 +1,88 @@
import terser from '@rollup/plugin-terser';
import json from "@rollup/plugin-json";
import replace from "@rollup/plugin-replace";

export default [
	{
		input: "node-index.js",
		output: [
			{
				file: "dist/node.cjs",
				format: "cjs",
				sourcemap: true
			}
		]
	},
	{
		input: "index.js",
		output: {
			file: "dist/index.js",
			format: "umd",
			name: "msgpackr",
			sourcemap: true
		}
	},
	{
		input: "index.js",
		plugins: [
			replace({ Function: 'BlockedFunction ' })
		],
		output: {
			file: "dist/index-no-eval.cjs",
			format: "umd",
			name: "msgpackr",
			sourcemap: true
		},
	},
	{
		input: "unpack.js",
		plugins: [
			replace({ Function: 'BlockedFunction ' })
		],
		output: {
			file: "dist/unpack-no-eval.cjs",
			format: "umd",
			name: "msgpackr",
			sourcemap: true
		},
	},
	{
		input: "index.js",
		plugins: [
			terser({})
		],
		output: {
			file: "dist/index.min.js",
			format: "umd",
			name: "msgpackr",
			sourcemap: true
		}
	},
	{
		input: "index.js",
		plugins: [
			replace({ Function: 'BlockedFunction ' }),
			terser({})
		],
		output: {
			file: "dist/index-no-eval.min.js",
			format: "umd",
			name: "msgpackr",
			sourcemap: true
		}
	},
	{
		input: "tests/test.js",
		plugins: [json()],
		external: ['chai', '../index.js'],
		output: {
			file: "dist/test.js",
			format: "iife",
			sourcemap: true,
			globals: {
				chai: 'chai',
				'./index.js': 'msgpackr',
			},
		}
	}
];
57
backend/node_modules/msgpackr/stream.js
generated
vendored
Normal file
@@ -0,0 +1,57 @@
import { Transform } from 'stream'
import { Packr } from './pack.js'
import { Unpackr } from './unpack.js'
var DEFAULT_OPTIONS = {objectMode: true}

export class PackrStream extends Transform {
	constructor(options) {
		if (!options)
			options = {}
		options.writableObjectMode = true
		super(options)
		options.sequential = true
		this.packr = options.packr || new Packr(options)
	}
	_transform(value, encoding, callback) {
		this.push(this.packr.pack(value))
		callback()
	}
}

export class UnpackrStream extends Transform {
	constructor(options) {
		if (!options)
			options = {}
		options.objectMode = true
		super(options)
		options.structures = []
		this.unpackr = options.unpackr || new Unpackr(options)
	}
	_transform(chunk, encoding, callback) {
		if (this.incompleteBuffer) {
			chunk = Buffer.concat([this.incompleteBuffer, chunk])
			this.incompleteBuffer = null
		}
		let values
		try {
			values = this.unpackr.unpackMultiple(chunk)
		} catch(error) {
			if (error.incomplete) {
				this.incompleteBuffer = chunk.slice(error.lastPosition)
				values = error.values
			}
			else
				throw error
		} finally {
			for (let value of values || []) {
				if (value === null)
					value = this.getNullValue()
				this.push(value)
			}
		}
		if (callback) callback()
	}
	getNullValue() {
		return Symbol.for(null)
	}
}
815
backend/node_modules/msgpackr/struct.js
generated
vendored
Normal file
@@ -0,0 +1,815 @@
|
||||
/*
|
||||
|
||||
For "any-data":
|
||||
32-55 - record with record ids (-32)
|
||||
56 - 8-bit record ids
|
||||
57 - 16-bit record ids
|
||||
58 - 24-bit record ids
|
||||
59 - 32-bit record ids
|
||||
250-255 - followed by typed fixed width values
|
||||
64-250 msgpackr/cbor/paired data
|
||||
arrays and strings within arrays are handled by paired encoding
|
||||
|
||||
Structure encoding:
|
||||
(type - string (using paired encoding))+
|
||||
|
||||
Type encoding
|
||||
encoding byte - fixed width byte - next reference+
|
||||
|
||||
Encoding byte:
|
||||
first bit:
|
||||
0 - inline
|
||||
1 - reference
|
||||
second bit:
|
||||
0 - data or number
|
||||
1 - string
|
||||
|
||||
remaining bits:
|
||||
character encoding - ISO-8859-x
|
||||
|
||||
|
||||
null (0xff)+ 0xf6
|
||||
null (0xff)+ 0xf7
|
||||
|
||||
*/
|
||||
|
||||
|
||||
import {setWriteStructSlots, RECORD_SYMBOL, addExtension} from './pack.js'
|
||||
import {setReadStruct, mult10, readString} from './unpack.js';
|
||||
const ASCII = 3; // the MIBenum from https://www.iana.org/assignments/character-sets/character-sets.xhtml (and other character encodings could be referenced by MIBenum)
|
||||
const NUMBER = 0;
|
||||
const UTF8 = 2;
|
||||
const OBJECT_DATA = 1;
|
||||
const DATE = 16;
|
||||
const TYPE_NAMES = ['num', 'object', 'string', 'ascii'];
|
||||
TYPE_NAMES[DATE] = 'date';
|
||||
const float32Headers = [false, true, true, false, false, true, true, false];
|
||||
let evalSupported;
|
||||
try {
|
||||
new Function('');
|
||||
evalSupported = true;
|
||||
} catch(error) {
|
||||
// if eval variants are not supported, do not create inline object readers ever
|
||||
}
|
||||
|
||||
let updatedPosition;
|
||||
const hasNodeBuffer = typeof Buffer !== 'undefined'
|
||||
let textEncoder, currentSource;
|
||||
try {
|
||||
textEncoder = new TextEncoder()
|
||||
} catch (error) {}
|
||||
const encodeUtf8 = hasNodeBuffer ? function(target, string, position) {
|
||||
return target.utf8Write(string, position, target.byteLength - position)
|
||||
} : (textEncoder && textEncoder.encodeInto) ?
|
||||
function(target, string, position) {
|
||||
return textEncoder.encodeInto(string, target.subarray(position)).written
|
||||
} : false
|
||||
|
||||
const TYPE = Symbol('type');
|
||||
const PARENT = Symbol('parent');
|
||||
setWriteStructSlots(writeStruct, prepareStructures);
|
||||
function writeStruct(object, target, encodingStart, position, structures, makeRoom, pack, packr) {
	let typedStructs = packr.typedStructs || (packr.typedStructs = []);
	// note that we rely on pack.js to load stored structures before we get to this point
	let targetView = target.dataView;
	let refsStartPosition = (typedStructs.lastStringStart || 100) + position;
	let safeEnd = target.length - 10;
	let start = position;
	if (position > safeEnd) {
		target = makeRoom(position);
		targetView = target.dataView;
		position -= encodingStart;
		start -= encodingStart;
		refsStartPosition -= encodingStart;
		encodingStart = 0;
		safeEnd = target.length - 10;
	}

	let refOffset, refPosition = refsStartPosition;

	let transition = typedStructs.transitions || (typedStructs.transitions = Object.create(null));
	let nextId = typedStructs.nextId || typedStructs.length;
	let headerSize =
		nextId < 0xf ? 1 :
		nextId < 0xf0 ? 2 :
		nextId < 0xf000 ? 3 :
		nextId < 0xf00000 ? 4 : 0;
	if (headerSize === 0)
		return 0;
	position += headerSize;
	let queuedReferences = [];
	let usedAscii0;
	let keyIndex = 0;
	for (let key in object) {
		let value = object[key];
		let nextTransition = transition[key];
		if (!nextTransition) {
			transition[key] = nextTransition = {
				key,
				parent: transition,
				enumerationOffset: 0,
				ascii0: null,
				ascii8: null,
				num8: null,
				string16: null,
				object16: null,
				num32: null,
				float64: null,
				date64: null
			};
		}
		if (position > safeEnd) {
			target = makeRoom(position);
			targetView = target.dataView;
			position -= encodingStart;
			start -= encodingStart;
			refsStartPosition -= encodingStart;
			refPosition -= encodingStart;
			encodingStart = 0;
			safeEnd = target.length - 10
		}
		switch (typeof value) {
			case 'number':
				let number = value;
				// first check to see if we are using a lot of ids and should default to wide/common format
				if (nextId < 200 || !nextTransition.num64) {
					if (number >> 0 === number && number < 0x20000000 && number > -0x1f000000) {
						if (number < 0xf6 && number >= 0 && (nextTransition.num8 && !(nextId > 200 && nextTransition.num32) || number < 0x20 && !nextTransition.num32)) {
							transition = nextTransition.num8 || createTypeTransition(nextTransition, NUMBER, 1);
							target[position++] = number;
						} else {
							transition = nextTransition.num32 || createTypeTransition(nextTransition, NUMBER, 4);
							targetView.setUint32(position, number, true);
							position += 4;
						}
						break;
					} else if (number < 0x100000000 && number >= -0x80000000) {
						targetView.setFloat32(position, number, true);
						if (float32Headers[target[position + 3] >>> 5]) {
							let xShifted
							// this checks for rounding of numbers that were encoded in 32-bit float to nearest significant decimal digit that could be preserved
							if (((xShifted = number * mult10[((target[position + 3] & 0x7f) << 1) | (target[position + 2] >> 7)]) >> 0) === xShifted) {
								transition = nextTransition.num32 || createTypeTransition(nextTransition, NUMBER, 4);
								position += 4;
								break;
							}
						}
					}
				}
				transition = nextTransition.num64 || createTypeTransition(nextTransition, NUMBER, 8);
				targetView.setFloat64(position, number, true);
				position += 8;
				break;
			case 'string':
				let strLength = value.length;
				refOffset = refPosition - refsStartPosition;
				if ((strLength << 2) + refPosition > safeEnd) {
					target = makeRoom((strLength << 2) + refPosition);
					targetView = target.dataView;
					position -= encodingStart;
					start -= encodingStart;
					refsStartPosition -= encodingStart;
					refPosition -= encodingStart;
					encodingStart = 0;
					safeEnd = target.length - 10
				}
				if (strLength > ((0xff00 + refOffset) >> 2)) {
					queuedReferences.push(key, value, position - start);
					break;
				}
				let isNotAscii
				let strStart = refPosition;
				if (strLength < 0x40) {
					let i, c1, c2;
					for (i = 0; i < strLength; i++) {
						c1 = value.charCodeAt(i)
						if (c1 < 0x80) {
							target[refPosition++] = c1
						} else if (c1 < 0x800) {
							isNotAscii = true;
							target[refPosition++] = c1 >> 6 | 0xc0
							target[refPosition++] = c1 & 0x3f | 0x80
						} else if (
							(c1 & 0xfc00) === 0xd800 &&
							((c2 = value.charCodeAt(i + 1)) & 0xfc00) === 0xdc00
						) {
							isNotAscii = true;
							c1 = 0x10000 + ((c1 & 0x03ff) << 10) + (c2 & 0x03ff)
							i++
							target[refPosition++] = c1 >> 18 | 0xf0
							target[refPosition++] = c1 >> 12 & 0x3f | 0x80
							target[refPosition++] = c1 >> 6 & 0x3f | 0x80
							target[refPosition++] = c1 & 0x3f | 0x80
						} else {
							isNotAscii = true;
							target[refPosition++] = c1 >> 12 | 0xe0
							target[refPosition++] = c1 >> 6 & 0x3f | 0x80
							target[refPosition++] = c1 & 0x3f | 0x80
						}
					}
				} else {
					refPosition += encodeUtf8(target, value, refPosition);
					isNotAscii = refPosition - strStart > strLength;
				}
				if (refOffset < 0xa0 || (refOffset < 0xf6 && (nextTransition.ascii8 || nextTransition.string8))) {
					// short strings
					if (isNotAscii) {
						if (!(transition = nextTransition.string8)) {
							if (typedStructs.length > 10 && (transition = nextTransition.ascii8)) {
								// we can safely change ascii to utf8 in place since they are compatible
								transition.__type = UTF8;
								nextTransition.ascii8 = null;
								nextTransition.string8 = transition;
								pack(null, 0, true); // special call to notify that structures have been updated
							} else {
								transition = createTypeTransition(nextTransition, UTF8, 1);
							}
						}
					} else if (refOffset === 0 && !usedAscii0) {
						usedAscii0 = true;
						transition = nextTransition.ascii0 || createTypeTransition(nextTransition, ASCII, 0);
						break; // don't increment position
					} // else ascii:
					else if (!(transition = nextTransition.ascii8) && !(typedStructs.length > 10 && (transition = nextTransition.string8)))
						transition = createTypeTransition(nextTransition, ASCII, 1);
					target[position++] = refOffset;
				} else {
					// TODO: Enable ascii16 at some point, but get the logic right
					//if (isNotAscii)
					transition = nextTransition.string16 || createTypeTransition(nextTransition, UTF8, 2);
					//else
					//	transition = nextTransition.ascii16 || createTypeTransition(nextTransition, ASCII, 2);
					targetView.setUint16(position, refOffset, true);
					position += 2;
				}
				break;
			case 'object':
				if (value) {
					if (value.constructor === Date) {
						transition = nextTransition.date64 || createTypeTransition(nextTransition, DATE, 8);
						targetView.setFloat64(position, value.getTime(), true);
						position += 8;
					} else {
						queuedReferences.push(key, value, keyIndex);
					}
					break;
				} else { // null
					nextTransition = anyType(nextTransition, position, targetView, -10); // match CBOR with this
					if (nextTransition) {
						transition = nextTransition;
						position = updatedPosition;
					} else queuedReferences.push(key, value, keyIndex);
				}
				break;
			case 'boolean':
				transition = nextTransition.num8 || nextTransition.ascii8 || createTypeTransition(nextTransition, NUMBER, 1);
				target[position++] = value ? 0xf9 : 0xf8; // match CBOR with these
				break;
			case 'undefined':
				nextTransition = anyType(nextTransition, position, targetView, -9); // match CBOR with this
				if (nextTransition) {
					transition = nextTransition;
					position = updatedPosition;
				} else queuedReferences.push(key, value, keyIndex);
				break;
			default:
				queuedReferences.push(key, value, keyIndex);
		}
		keyIndex++;
	}

	for (let i = 0, l = queuedReferences.length; i < l;) {
		let key = queuedReferences[i++];
		let value = queuedReferences[i++];
		let propertyIndex = queuedReferences[i++];
		let nextTransition = transition[key];
		if (!nextTransition) {
			transition[key] = nextTransition = {
				key,
				parent: transition,
				enumerationOffset: propertyIndex - keyIndex,
				ascii0: null,
				ascii8: null,
				num8: null,
				string16: null,
				object16: null,
				num32: null,
				float64: null
			};
		}
		let newPosition;
		if (value) {
			/*if (typeof value === 'string') { // TODO: we could re-enable long strings
				if (position + value.length * 3 > safeEnd) {
					target = makeRoom(position + value.length * 3);
					position -= start;
					targetView = target.dataView;
					start = 0;
				}
				newPosition = position + target.utf8Write(value, position, 0xffffffff);
			} else { */
			let size;
			refOffset = refPosition - refsStartPosition;
			if (refOffset < 0xff00) {
				transition = nextTransition.object16;
				if (transition)
					size = 2;
				else if ((transition = nextTransition.object32))
					size = 4;
				else {
					transition = createTypeTransition(nextTransition, OBJECT_DATA, 2);
					size = 2;
				}
			} else {
				transition = nextTransition.object32 || createTypeTransition(nextTransition, OBJECT_DATA, 4);
				size = 4;
			}
			newPosition = pack(value, refPosition);
			//}
			if (typeof newPosition === 'object') {
				// re-allocated
				refPosition = newPosition.position;
				targetView = newPosition.targetView;
				target = newPosition.target;
				refsStartPosition -= encodingStart;
				position -= encodingStart;
				start -= encodingStart;
				encodingStart = 0;
			} else
				refPosition = newPosition;
			if (size === 2) {
				targetView.setUint16(position, refOffset, true);
				position += 2;
			} else {
				targetView.setUint32(position, refOffset, true);
				position += 4;
			}
		} else { // null or undefined
			transition = nextTransition.object16 || createTypeTransition(nextTransition, OBJECT_DATA, 2);
			targetView.setInt16(position, value === null ? -10 : -9, true);
			position += 2;
		}
		keyIndex++;
	}

	let recordId = transition[RECORD_SYMBOL];
	if (recordId == null) {
		recordId = packr.typedStructs.length;
		let structure = [];
		let nextTransition = transition;
		let key, type;
		while ((type = nextTransition.__type) !== undefined) {
			let size = nextTransition.__size;
			nextTransition = nextTransition.__parent;
			key = nextTransition.key;
			let property = [type, size, key];
			if (nextTransition.enumerationOffset)
				property.push(nextTransition.enumerationOffset);
			structure.push(property);
			nextTransition = nextTransition.parent;
		}
		structure.reverse();
		transition[RECORD_SYMBOL] = recordId;
		packr.typedStructs[recordId] = structure;
		pack(null, 0, true); // special call to notify that structures have been updated
	}

	switch (headerSize) {
		case 1:
			if (recordId >= 0x10) return 0;
			target[start] = recordId + 0x20;
			break;
		case 2:
			if (recordId >= 0x100) return 0;
			target[start] = 0x38;
			target[start + 1] = recordId;
			break;
		case 3:
			if (recordId >= 0x10000) return 0;
			target[start] = 0x39;
			targetView.setUint16(start + 1, recordId, true);
			break;
		case 4:
			if (recordId >= 0x1000000) return 0;
			targetView.setUint32(start, (recordId << 8) + 0x3a, true);
			break;
	}

	if (position < refsStartPosition) {
		if (refsStartPosition === refPosition)
			return position; // no refs
		// adjust positioning
		target.copyWithin(position, refsStartPosition, refPosition);
		refPosition += position - refsStartPosition;
		typedStructs.lastStringStart = position - start;
	} else if (position > refsStartPosition) {
		if (refsStartPosition === refPosition)
			return position; // no refs
		typedStructs.lastStringStart = position - start;
		return writeStruct(object, target, encodingStart, start, structures, makeRoom, pack, packr);
	}
	return refPosition;
}
function anyType(transition, position, targetView, value) {
	let nextTransition;
	if ((nextTransition = transition.ascii8 || transition.num8)) {
		targetView.setInt8(position, value, true);
		updatedPosition = position + 1;
		return nextTransition;
	}
	if ((nextTransition = transition.string16 || transition.object16)) {
		targetView.setInt16(position, value, true);
		updatedPosition = position + 2;
		return nextTransition;
	}
	if (nextTransition = transition.num32) {
		targetView.setUint32(position, 0xe0000100 + value, true);
		updatedPosition = position + 4;
		return nextTransition;
	}
	// transition.float64
	if (nextTransition = transition.num64) {
		targetView.setFloat64(position, NaN, true);
		targetView.setInt8(position, value);
		updatedPosition = position + 8;
		return nextTransition;
	}
	updatedPosition = position;
	// TODO: can we do an "any" type where we defer the decision?
	return;
}
function createTypeTransition(transition, type, size) {
	let typeName = TYPE_NAMES[type] + (size << 3);
	let newTransition = transition[typeName] || (transition[typeName] = Object.create(null));
	newTransition.__type = type;
	newTransition.__size = size;
	newTransition.__parent = transition;
	return newTransition;
}
function onLoadedStructures(sharedData) {
	if (!(sharedData instanceof Map))
		return sharedData;
	let typed = sharedData.get('typed') || [];
	if (Object.isFrozen(typed))
		typed = typed.map(structure => structure.slice(0));
	let named = sharedData.get('named');
	let transitions = Object.create(null);
	for (let i = 0, l = typed.length; i < l; i++) {
		let structure = typed[i];
		let transition = transitions;
		for (let [type, size, key] of structure) {
			let nextTransition = transition[key];
			if (!nextTransition) {
				transition[key] = nextTransition = {
					key,
					parent: transition,
					enumerationOffset: 0,
					ascii0: null,
					ascii8: null,
					num8: null,
					string16: null,
					object16: null,
					num32: null,
					float64: null,
					date64: null,
				};
			}
			transition = createTypeTransition(nextTransition, type, size);
		}
		transition[RECORD_SYMBOL] = i;
	}
	typed.transitions = transitions;
	this.typedStructs = typed;
	this.lastTypedStructuresLength = typed.length;
	return named;
}
var sourceSymbol = Symbol.for('source')
function readStruct(src, position, srcEnd, unpackr) {
	let recordId = src[position++] - 0x20;
	if (recordId >= 24) {
		switch(recordId) {
			case 24: recordId = src[position++]; break;
			// little endian:
			case 25: recordId = src[position++] + (src[position++] << 8); break;
			case 26: recordId = src[position++] + (src[position++] << 8) + (src[position++] << 16); break;
			case 27: recordId = src[position++] + (src[position++] << 8) + (src[position++] << 16) + (src[position++] << 24); break;
		}
	}
	let structure = unpackr.typedStructs && unpackr.typedStructs[recordId];
	if (!structure) {
		// copy src buffer because getStructures will override it
		src = Uint8Array.prototype.slice.call(src, position, srcEnd);
		srcEnd -= position;
		position = 0;
		if (!unpackr.getStructures)
			throw new Error(`Reference to shared structure ${recordId} without getStructures method`);
		unpackr._mergeStructures(unpackr.getStructures());
		if (!unpackr.typedStructs)
			throw new Error('Could not find any shared typed structures');
		unpackr.lastTypedStructuresLength = unpackr.typedStructs.length;
		structure = unpackr.typedStructs[recordId];
		if (!structure)
			throw new Error('Could not find typed structure ' + recordId);
	}
	var construct = structure.construct;
	var fullConstruct = structure.fullConstruct;
	if (!construct) {
		construct = structure.construct = function LazyObject() {
		}
		fullConstruct = structure.fullConstruct = function LoadedObject() {
		}
		fullConstruct.prototype = unpackr.structPrototype || {};
		var prototype = construct.prototype = unpackr.structPrototype ? Object.create(unpackr.structPrototype) : {};
		let properties = [];
		let currentOffset = 0;
		let lastRefProperty;
		for (let i = 0, l = structure.length; i < l; i++) {
			let definition = structure[i];
			let [ type, size, key, enumerationOffset ] = definition;
			if (key === '__proto__')
				key = '__proto_';
			let property = {
				key,
				offset: currentOffset,
			}
			if (enumerationOffset)
				properties.splice(i + enumerationOffset, 0, property);
			else
				properties.push(property);
			let getRef;
			switch(size) { // TODO: Move into a separate function
				case 0: getRef = () => 0; break;
				case 1:
					getRef = (source, position) => {
						let ref = source.bytes[position + property.offset];
						return ref >= 0xf6 ? toConstant(ref) : ref;
					};
					break;
				case 2:
					getRef = (source, position) => {
						let src = source.bytes;
						let dataView = src.dataView || (src.dataView = new DataView(src.buffer, src.byteOffset, src.byteLength));
						let ref = dataView.getUint16(position + property.offset, true);
						return ref >= 0xff00 ? toConstant(ref & 0xff) : ref;
					};
					break;
				case 4:
					getRef = (source, position) => {
						let src = source.bytes;
						let dataView = src.dataView || (src.dataView = new DataView(src.buffer, src.byteOffset, src.byteLength));
						let ref = dataView.getUint32(position + property.offset, true);
						return ref >= 0xffffff00 ? toConstant(ref & 0xff) : ref;
					};
					break;
			}
			property.getRef = getRef;
			currentOffset += size;
			let get;
			switch(type) {
				case ASCII:
					if (lastRefProperty && !lastRefProperty.next)
						lastRefProperty.next = property;
					lastRefProperty = property;
					property.multiGetCount = 0;
					get = function(source) {
						let src = source.bytes;
						let position = source.position;
						let refStart = currentOffset + position;
						let ref = getRef(source, position);
						if (typeof ref !== 'number') return ref;

						let end, next = property.next;
						while(next) {
							end = next.getRef(source, position);
							if (typeof end === 'number')
								break;
							else
								end = null;
							next = next.next;
						}
						if (end == null)
							end = source.bytesEnd - refStart;
						if (source.srcString) {
							return source.srcString.slice(ref, end);
						}
						/*if (property.multiGetCount > 0) {
							let asciiEnd;
							next = firstRefProperty;
							let dataView = src.dataView || (src.dataView = new DataView(src.buffer, src.byteOffset, src.byteLength));
							do {
								asciiEnd = dataView.getUint16(source.position + next.offset, true);
								if (asciiEnd < 0xff00)
									break;
								else
									asciiEnd = null;
							} while((next = next.next));
							if (asciiEnd == null)
								asciiEnd = source.bytesEnd - refStart
							source.srcString = src.toString('latin1', refStart, refStart + asciiEnd);
							return source.srcString.slice(ref, end);
						}
						if (source.prevStringGet) {
							source.prevStringGet.multiGetCount += 2;
						} else {
							source.prevStringGet = property;
							property.multiGetCount--;
						}*/
						return readString(src, ref + refStart, end - ref);
						//return src.toString('latin1', ref + refStart, end + refStart);
					};
					break;
				case UTF8: case OBJECT_DATA:
					if (lastRefProperty && !lastRefProperty.next)
						lastRefProperty.next = property;
					lastRefProperty = property;
					get = function(source) {
						let position = source.position;
						let refStart = currentOffset + position;
						let ref = getRef(source, position);
						if (typeof ref !== 'number') return ref;
						let src = source.bytes;
						let end, next = property.next;
						while(next) {
							end = next.getRef(source, position);
							if (typeof end === 'number')
								break;
							else
								end = null;
							next = next.next;
						}
						if (end == null)
							end = source.bytesEnd - refStart;
						if (type === UTF8) {
							return src.toString('utf8', ref + refStart, end + refStart);
						} else {
							currentSource = source;
							try {
								return unpackr.unpack(src, { start: ref + refStart, end: end + refStart });
							} finally {
								currentSource = null;
							}
						}
					};
					break;
				case NUMBER:
					switch(size) {
						case 4:
							get = function (source) {
								let src = source.bytes;
								let dataView = src.dataView || (src.dataView = new DataView(src.buffer, src.byteOffset, src.byteLength));
								let position = source.position + property.offset;
								let value = dataView.getInt32(position, true)
								if (value < 0x20000000) {
									if (value > -0x1f000000)
										return value;
									if (value > -0x20000000)
										return toConstant(value & 0xff);
								}
								let fValue = dataView.getFloat32(position, true);
								// this does rounding of numbers that were encoded in 32-bit float to nearest significant decimal digit that could be preserved
								let multiplier = mult10[((src[position + 3] & 0x7f) << 1) | (src[position + 2] >> 7)]
								return ((multiplier * fValue + (fValue > 0 ? 0.5 : -0.5)) >> 0) / multiplier;
							};
							break;
						case 8:
							get = function (source) {
								let src = source.bytes;
								let dataView = src.dataView || (src.dataView = new DataView(src.buffer, src.byteOffset, src.byteLength));
								let value = dataView.getFloat64(source.position + property.offset, true);
								if (isNaN(value)) {
									let byte = src[source.position + property.offset];
									if (byte >= 0xf6)
										return toConstant(byte);
								}
								return value;
							};
							break;
						case 1:
							get = function (source) {
								let src = source.bytes;
								let value = src[source.position + property.offset];
								return value < 0xf6 ? value : toConstant(value);
							};
							break;
					}
					break;
				case DATE:
					get = function (source) {
						let src = source.bytes;
						let dataView = src.dataView || (src.dataView = new DataView(src.buffer, src.byteOffset, src.byteLength));
						return new Date(dataView.getFloat64(source.position + property.offset, true));
					};
					break;
			}
			property.get = get;
		}
		// TODO: load the srcString for faster string decoding on toJSON
		if (evalSupported) {
			let objectLiteralProperties = [];
			let args = [];
			let i = 0;
			let hasInheritedProperties;
			for (let property of properties) { // assign in enumeration order
				if (unpackr.alwaysLazyProperty && unpackr.alwaysLazyProperty(property.key)) {
					// these properties are not eagerly evaluated and this can be used for creating properties
					// that are not serialized as JSON
					hasInheritedProperties = true;
					continue;
				}
				Object.defineProperty(prototype, property.key, { get: withSource(property.get), enumerable: true });
				let valueFunction = 'v' + i++;
				args.push(valueFunction);
				objectLiteralProperties.push('o[' + JSON.stringify(property.key) + ']=' + valueFunction + '(s)');
			}
			if (hasInheritedProperties) {
				objectLiteralProperties.push('__proto__:this');
			}
			let toObject = (new Function(...args, 'var c=this;return function(s){var o=new c();' + objectLiteralProperties.join(';') + ';return o;}')).apply(fullConstruct, properties.map(prop => prop.get));
			Object.defineProperty(prototype, 'toJSON', {
				value(omitUnderscoredProperties) {
					return toObject.call(this, this[sourceSymbol]);
				}
			});
		} else {
			Object.defineProperty(prototype, 'toJSON', {
				value(omitUnderscoredProperties) {
					// return an enumerable object with own properties to JSON stringify
					let resolved = {};
					for (let i = 0, l = properties.length; i < l; i++) {
						// TODO: check alwaysLazyProperty
						let key = properties[i].key;
						resolved[key] = this[key];
					}
					return resolved;
				},
				// not enumerable or anything
			});
		}
	}
	var instance = new construct();
	instance[sourceSymbol] = {
		bytes: src,
		position,
		srcString: '',
		bytesEnd: srcEnd
	}
	return instance;
}
function toConstant(code) {
	switch(code) {
		case 0xf6: return null;
		case 0xf7: return undefined;
		case 0xf8: return false;
		case 0xf9: return true;
	}
	throw new Error('Unknown constant');
}
function withSource(get) {
	return function() {
		return get(this[sourceSymbol]);
	}
}

function saveState() {
	if (currentSource) {
		currentSource.bytes = Uint8Array.prototype.slice.call(currentSource.bytes, currentSource.position, currentSource.bytesEnd);
		currentSource.position = 0;
		currentSource.bytesEnd = currentSource.bytes.length;
	}
}
function prepareStructures(structures, packr) {
	if (packr.typedStructs) {
		let structMap = new Map();
		structMap.set('named', structures);
		structMap.set('typed', packr.typedStructs);
		structures = structMap;
	}
	let lastTypedStructuresLength = packr.lastTypedStructuresLength || 0;
	structures.isCompatible = existing => {
		let compatible = true;
		if (existing instanceof Map) {
			let named = existing.get('named') || [];
			if (named.length !== (packr.lastNamedStructuresLength || 0))
				compatible = false;
			let typed = existing.get('typed') || [];
			if (typed.length !== lastTypedStructuresLength)
				compatible = false;
		} else if (existing instanceof Array || Array.isArray(existing)) {
			if (existing.length !== (packr.lastNamedStructuresLength || 0))
				compatible = false;
		}
		if (!compatible)
			packr._mergeStructures(existing);
		return compatible;
	};
	packr.lastTypedStructuresLength = packr.typedStructs && packr.typedStructs.length;
	return structures;
}

setReadStruct(readStruct, onLoadedStructures, saveState);
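The `switch (headerSize)` block in `writeStruct` and the record-id decoding at the top of `readStruct` are two halves of the same little-endian header format: ids below 0x10 fit into a single byte (0x20 + id), while larger ids get a marker byte 0x38/0x39/0x3a followed by an 8/16/24-bit little-endian id. A standalone sketch of the encoder side (`encodeRecordHeader` is a hypothetical helper name, mirroring the byte layout above):

```javascript
// Hypothetical helper mirroring writeStruct's switch(headerSize):
// small ids inline into one byte; bigger ids use a marker byte plus
// a little-endian id of 1, 2, or 3 bytes.
function encodeRecordHeader(recordId) {
	if (recordId < 0x10) return [recordId + 0x20];
	if (recordId < 0x100) return [0x38, recordId];
	if (recordId < 0x10000) return [0x39, recordId & 0xff, recordId >> 8];
	if (recordId < 0x1000000)
		return [0x3a, recordId & 0xff, (recordId >> 8) & 0xff, recordId >> 16];
	throw new Error('record id out of range');
}

console.log(encodeRecordHeader(5));      // → [ 37 ]         (0x25, inline)
console.log(encodeRecordHeader(0x1234)); // → [ 57, 52, 18 ] (0x39, then LE id)
```

`readStruct` inverts this by subtracting 0x20 from the first byte and, for results of 24 and up, reading the following little-endian id bytes.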
3 backend/node_modules/msgpackr/test-worker.js generated vendored Normal file
@@ -0,0 +1,3 @@
setTimeout(() => {
	console.log('done');
}, 10000);
2 backend/node_modules/msgpackr/unpack.d.cts generated vendored Normal file
@@ -0,0 +1,2 @@
export { Unpackr, Decoder, unpack, unpackMultiple, decode,
	addExtension, FLOAT32_OPTIONS, Options, Extension, clearSource, roundFloat32 } from '.'
2 backend/node_modules/msgpackr/unpack.d.ts generated vendored Normal file
@@ -0,0 +1,2 @@
export { Unpackr, Decoder, unpack, unpackMultiple, decode,
	addExtension, FLOAT32_OPTIONS, Options, Extension, clearSource, roundFloat32 } from '.'
1221 backend/node_modules/msgpackr/unpack.js generated vendored Normal file
File diff suppressed because it is too large