@@ -9,28 +9,29 @@ Scala macros that generate codecs for case classes, standard types and collections
to get maximum performance of JSON parsing and serialization.

[Latest results of benchmarks](http://plokhotnyuk.github.io/jsoniter-scala/) which compare parsing and serialization
- performance of Jsoniter Scala vs. Jackson, Circe and Play-JSON libraries using JDK 8 & JDK 9 on the following
- environment: Intel® Core™ i7-7700HQ CPU @ 2.8GHz (max 3.8GHz), RAM 16Gb DDR4-2400, Ubuntu 16.04,
- Linux notebook 4.13.0-32-generic, Oracle JDK 64-bit (builds 1.8.0_161-b12 and 9.0.4+11 accordingly)
+ performance of Jsoniter Scala vs. Jackson, Circe and Play-JSON libraries using JDK 8 & 9 on the following environment:
+ Intel® Core™ i7-7700HQ CPU @ 2.8GHz (max 3.8GHz), RAM 16Gb DDR4-2400, Ubuntu 16.04, Linux notebook 4.13.0-32-generic,
+ Oracle JDK 64-bit (builds 1.8.0_161-b12 and 9.0.4+11 respectively)

## Goals

Initially this library was developed for requirements of real-time bidding in ad-tech, and the goals are simple:
- do parsing and serialization of JSON directly from UTF-8 bytes to your case classes and Scala collections and back but
- do it crazily fast w/o reflection, intermediate trees, strings or events, w/ minimum allocations and copying
+ do it crazily fast without runtime reflection, intermediate AST trees, strings or events, with minimum allocations and
+ copying
- do validation of UTF-8 encoding, JSON format and mapped values efficiently with clear reporting, do not replace
illegally encoded characters of string values by placeholder characters
- define in _compile-time_ classes that will be instantiated during parsing to minimize probability of runtime issues,
generated sources can be inspected to prove that there are no security vulnerabilities during parsing

- It targets JDK 8+ w/o any platform restrictions.
+ It targets JDK 8+ without any platform restrictions.

Support of Scala.js and Scala Native is not a goal for the moment.

## Features and limitations
- JSON parsing from `Array[Byte]` or `java.io.InputStream`
- JSON serialization to `Array[Byte]` or `java.io.OutputStream`
- - Parsing of streaming JSON values and JSON arrays from `java.io.InputStream` w/o need of holding all parsed values
+ - Parsing of streaming JSON values and JSON arrays from `java.io.InputStream` without need of holding all parsed values
in the memory
- Support reading part of `Array[Byte]` by specifying of position and limit of reading from/to
- Support writing to pre-allocated `Array[Byte]` by specifying of position of writing from
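As a sketch of the byte-array API above (the `Device` case class is invented for illustration; assumes the jsoniter-scala `core` and `macros` modules are on the classpath):

```scala
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

// Hypothetical data structure, for illustration only
case class Device(id: Int, model: String)

// The codec is generated at compile time by the JsonCodecMaker macro
implicit val codec: JsonValueCodec[Device] = JsonCodecMaker.make[Device](CodecMakerConfig())

// Parse directly from UTF-8 bytes, and serialize back to UTF-8 bytes
val device: Device = readFromArray("""{"id":1,"model":"HTC One X"}""".getBytes("UTF-8"))
val json: Array[Byte] = writeToArray(device)
```

The stream counterparts (`readFromStream`/`writeToStream`) follow the same shape for `java.io.InputStream`/`OutputStream`.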
@@ -45,7 +46,7 @@ Support of Scala.js and Scala Native is not a goal for the moment.
`java.util.UUID`, `java.time.*`, and value classes for any of them
- Support of ADTs with sealed trait or sealed abstract class base and case classes or case objects as leaf classes,
using discriminator field with string type of value
- - Implicitly resolvable codecs for values and key codecs for map keys
+ - Implicitly resolvable value codecs for JSON values and key codecs for JSON object keys that are mapped to maps
- Support only acyclic graphs of class instances
- Fields with default values that are defined in the constructor are optional, other fields are required (no special
annotation required)
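The ADT support above can be sketched as follows (the `Shape` hierarchy is hypothetical; the discriminator field name is assumed to default to `"type"`):

```scala
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

// Hypothetical sealed hierarchy, for illustration only
sealed trait Shape
case class Circle(radius: Double) extends Shape
case class Rect(width: Double, height: Double) extends Shape

// The macro adds a string-valued discriminator field to tell the leaf classes apart
implicit val shapeCodec: JsonValueCodec[Shape] = JsonCodecMaker.make[Shape](CodecMakerConfig())

// The serialized form carries the leaf class name in the discriminator field
val bytes: Array[Byte] = writeToArray[Shape](Circle(radius = 1.5))
```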
@@ -60,7 +61,7 @@ Support of Scala.js and Scala Native is not a goal for the moment.
- No dependencies on extra libraries excluding Scala's `scala-library` and `scala-reflect`

There are a number of configurable options that can be set in compile-time:
- - Ability to read/write number of containers from/to string values
+ - Ability to read/write numbers of containers from/to string values
- Skipping of unexpected fields or throwing of parse exceptions
- Mapping function for names between case classes and JSON, including predefined functions which enforce
snake_case, kebab-case or camelCase names for all fields
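For instance, the name-mapping option above might be set like this (a sketch; `enforce_snake_case` and the `fieldNameMapper` parameter name are recalled from the library's docs and should be checked against your version):

```scala
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

// Hypothetical class, for illustration only
case class UserProfile(firstName: String, lastName: String)

// Map camelCase Scala field names to snake_case JSON keys at compile time
implicit val codec: JsonValueCodec[UserProfile] =
  JsonCodecMaker.make[UserProfile](CodecMakerConfig(fieldNameMapper = JsonCodecMaker.enforce_snake_case))
```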
@@ -70,7 +71,7 @@ There are a number of configurable options that can be set in compile-time:
List of options that change parsing and serialization at runtime:
- Serialization of strings with escaped Unicode characters to be ASCII compatible
- Indenting of output and its step
- - Throwing of stackless parsing exceptions to greatly reduce impact on performance
+ - Throwing of stack-less parsing exceptions to greatly reduce impact on performance
- Turning off hex dumping of the part of an internal byte buffer affected by an error to reduce impact on performance
- Preferred size of internal buffers when parsing from `InputStream` or serializing to `OutputStream`
@@ -167,20 +168,20 @@ sbt clean 'benchmark/jmh:run -prof jmh.extras.JFR -wi 10 -i 50 .*GoogleMapsAPI.*
On Linux the perf profiler can be used to see CPU event statistics normalized per ops:

```sh
- sbt -no-colors clean 'benchmark/jmh:run -prof perfnorm .*TwitterAPI.*' > twitter_api_perfnorm.txt
+ sbt clean 'benchmark/jmh:run -prof perfnorm .*TwitterAPI.*'
```

The following command can be used to profile and print assembly code of hottest methods, but it requires [setup of an
additional library to make PrintAssembly feature enabled](http://psy-lob-saw.blogspot.com/2013/01/java-print-assembly.html):

```sh
- sbt -no-colors clean 'benchmark/jmh:run -prof perfasm -wi 10 -i 10 .*Adt.*readJsoniter.*' > read_adt_perfasm.txt
+ sbt clean 'benchmark/jmh:run -prof perfasm -wi 10 -i 10 .*Adt.*readJsoniter.*'
```

- To see throughput with allocation rate of generated codecs run benchmarks with GC profiler using following command:
+ To see throughput with allocation rate of generated codecs run benchmarks with GC profiler using the following command:

```sh
- sbt -no-colors clean 'benchmark/jmh:run -prof gc .*Benchmark.*' > gc.txt
+ sbt clean 'benchmark/jmh:run -prof gc .*Benchmark.*'
```

Results of benchmarks can be stored in different formats: *.csv, *.json, etc. All supported formats can be listed by:
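One way to do that, assuming the standard JMH `-lrf` option (which lists machine-readable result formats) is available through the sbt-jmh plugin:

```sh
sbt clean 'benchmark/jmh:run -lrf'
```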