
CSVDecoder: No Long and Int out of range exceptions #485

@Pavel38l

Description


I have a problem when processing a CSV file supplied by the user.
If a numeric value is outside the range of Int or Long, there is no way for me to notify the user that the value is too large.

The values are silently corrupted by _getBigInteger().intValue() and _getBigInteger().longValue() in CsvDecoder, and no error occurs.
Would it be possible to use _getBigInteger().intValueExact() and _getBigInteger().longValueExact() instead, which check the bounds and throw an exception on overflow?
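For reference, the difference in behavior between the two conversions can be seen with plain Java, outside of Jackson (a minimal sketch; the class name is made up for the demo):

```java
import java.math.BigInteger;

public class ExactConversionDemo {
    public static void main(String[] args) {
        // The testInt value from the CSV example below
        BigInteger big = new BigInteger("111111111111111111111111111111111111111111");

        // intValue() is a narrowing conversion: it silently keeps only
        // the low 32 bits of the value
        System.out.println(big.intValue()); // -954437177

        // intValueExact() checks the bounds first and throws
        // ArithmeticException when the value does not fit into an int
        try {
            big.intValueExact();
        } catch (ArithmeticException e) {
            System.out.println("out of range: " + e.getMessage());
        }
    }
}
```

longValue() and longValueExact() behave the same way for the 64-bit case.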

Example (Kotlin):
CSV file:

testInt,testLong
111111111111111111111111111111111111111111,2222222222222222222222222222222222222222

Entity:

import com.fasterxml.jackson.annotation.JsonProperty

open class TestEntity(
        @JsonProperty("testInt")  val testInt: Int,
        @JsonProperty("testLong") val testLong: Long,
)
Reader:

import com.fasterxml.jackson.dataformat.csv.CsvMapper
import com.fasterxml.jackson.dataformat.csv.CsvParser
import java.io.File
import java.io.InputStream

class TestClass {
    fun readConvertFile() {
        val file = File("example.csv")
        val entities = readCsv(TestEntity::class.java, file.inputStream())
        println(entities[0].testInt)
        println(entities[0].testLong)
    }

    companion object {
        private val csvMapper = CsvMapper()
            .enable(CsvParser.Feature.SKIP_EMPTY_LINES)
            .enable(CsvParser.Feature.TRIM_SPACES)
            .enable(CsvParser.Feature.WRAP_AS_ARRAY)
            .enable(CsvParser.Feature.INSERT_NULLS_FOR_MISSING_COLUMNS)

        fun <T> readCsv(classArg: Class<T>, stream: InputStream): List<T> {
            val schema = csvMapper.schemaFor(classArg).withHeader().withColumnReordering(true).withNullValue("")
            val reader = csvMapper.readerFor(classArg).with(schema)
            return reader.readValues<T>(stream).readAll()
        }
    }
}

ER: an error message like the one jackson-databind's ObjectMapper produces, e.g. Numeric value (111111111111111111111111111111111111111111) out of range of int
AR: the values are silently corrupted:

-954437177
-1121560256615750770
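The corrupted int value is exactly the low 32 bits of the original number reinterpreted as a signed int, which is what BigInteger.intValue() returns per its Javadoc (longValue() does the same with the low 64 bits). A minimal check (class name made up for the demo):

```java
import java.math.BigInteger;

public class TruncationDemo {
    public static void main(String[] args) {
        // The testInt value from the CSV above
        BigInteger big = new BigInteger("111111111111111111111111111111111111111111");

        // intValue() keeps only the low 32 bits, reinterpreted as signed
        System.out.println(big.intValue()); // -954437177

        // The same result obtained explicitly: value mod 2^32,
        // then cast down to a signed 32-bit int
        long low32 = big.mod(BigInteger.ONE.shiftLeft(32)).longValue();
        System.out.println((int) low32); // -954437177 again
    }
}
```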

I am using:

<dependency>
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-csv</artifactId>
    <version>2.17.2</version>
</dependency>

Labels: 2.18 (Fix or feature targeted at 2.18 release), csv