Packages

pl.touk.nussknacker.engine.schemedkafka.schema

DefaultAvroSchemaEvolution

class DefaultAvroSchemaEvolution extends AvroSchemaEvolution with DatumReaderWriterMixin with RecordDeserializer

This is the base implementation of AvroSchemaEvolution. Its strategy for evolving a record to a target schema is:

serialize the record with the record's own schema -> deserialize the bytes with the provided (target) schema

The strategy is based on Confluent's serialization and deserialization implementation, but we don't allocate bytes for the MagicByte and schema Id, because we don't need them.

For now this is the easiest way to convert a GenericContainer record to the desired schema.
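
The strategy above can be sketched with plain Avro APIs. The following is a minimal, hypothetical illustration (not the actual Nussknacker code, and the schemas are invented for the example): the record is serialized with its own schema and then deserialized with the target schema, letting Avro's schema resolution apply defaults and reorder fields.

```scala
import java.io.ByteArrayOutputStream
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}

object EvolutionSketch {
  // Hypothetical writer schema: the schema the record was created with
  val writerSchema: Schema = new Schema.Parser().parse(
    """{"type":"record","name":"Person","fields":[{"name":"name","type":"string"}]}""")

  // Hypothetical reader (target) schema: adds a field with a default, so evolution succeeds
  val readerSchema: Schema = new Schema.Parser().parse(
    """{"type":"record","name":"Person","fields":[
      |{"name":"name","type":"string"},{"name":"age","type":"int","default":0}]}""".stripMargin)

  // serialize with the record's own schema -> deserialize with the target schema
  def alignRecordToSchema(record: GenericRecord, target: Schema): GenericRecord = {
    val out = new ByteArrayOutputStream()
    val encoder = EncoderFactory.get().binaryEncoder(out, null)
    new GenericDatumWriter[GenericRecord](record.getSchema).write(record, encoder)
    encoder.flush()
    // No MagicByte / schema Id header here: the payload is plain Avro binary
    val decoder = DecoderFactory.get().binaryDecoder(out.toByteArray, null)
    new GenericDatumReader[GenericRecord](record.getSchema, target).read(null, decoder)
  }

  val record = new GenericData.Record(writerSchema)
  record.put("name", "Alice")

  // "age" is filled in from the reader schema's default value
  val evolved: GenericRecord = alignRecordToSchema(record, readerSchema)
}
```

Because the reader schema declares a default for the added `age` field, the evolved record carries `name = "Alice"` and `age = 0`.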

Instance Constructors

  1. new DefaultAvroSchemaEvolution()

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def alignRecordToSchema(record: GenericContainer, schema: Schema): Any
  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def canBeEvolved(record: GenericContainer, schema: Schema): Boolean
  7. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  8. def createDatumReader(writerSchema: Schema, readerSchema: Schema, useSchemaReflection: Boolean, useSpecificAvroReader: Boolean): DatumReader[AnyRef]
    Definition Classes
    DatumReaderWriterMixin
  9. def createDatumWriter(record: Any, schema: Schema, useSchemaReflection: Boolean): GenericDatumWriter[Any]
    Definition Classes
    DatumReaderWriterMixin
  10. final val decoderFactory: DecoderFactory
    Attributes
    protected
    Definition Classes
    DefaultAvroSchemaEvolution → RecordDeserializer
  11. def deserializePayloadToSchema(payload: Array[Byte], writerSchema: Schema, readerSchema: Schema): Any

    This is a copy-paste from AbstractKafkaAvroDeserializer#DeserializationContext.read with some modifications: we pass in the record's buffer data and the schema that will be used to convert the record.

    Attributes
    protected
  12. def deserializeRecord(readerSchemaData: RuntimeSchemaData[AvroSchema], reader: DatumReader[AnyRef], buffer: ByteBuffer, bufferDataStart: Int): AnyRef
    Attributes
    protected
    Definition Classes
    RecordDeserializer
  13. final val encoderFactory: EncoderFactory
    Attributes
    protected
  14. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  16. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  17. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  18. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  19. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  20. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  21. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  22. val primitives: Map[String, Schema]

    Used to check whether the writerSchema is primitive when creating the DatumReader (createDatumReader).

    Attributes
    protected
    Definition Classes
    DatumReaderWriterMixin
  23. def schemaIdSerializationEnabled: Boolean
    Attributes
    protected
    Definition Classes
    DefaultAvroSchemaEvolution → RecordDeserializer
  24. def serializeRecord(record: GenericContainer): Array[Byte]

    Record serialization method, largely adapted from AbstractKafkaAvroSerializer#DeserializationContext.read. We use the Confluent serialization mechanism, but without some of its specifics:

    - fetching the schema from the registry
    - fetching the schema Id
    - serializing the MagicByte and version

    For serialization we use the schema from the record.

    Attributes
    protected
  25. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  26. def toString(): String
    Definition Classes
    AnyRef → Any
  27. final val useSchemaReflection: Boolean(false)

    In the future this could be made configurable.

    Attributes
    protected
  28. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  30. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated @deprecated
    Deprecated

    (Since version ) see corresponding Javadoc for more information.

Inherited from RecordDeserializer

Inherited from DatumReaderWriterMixin

Inherited from AvroSchemaEvolution

Inherited from AnyRef

Inherited from Any