class DefaultAvroSchemaEvolution extends AvroSchemaEvolution with DatumReaderWriterMixin with RecordDeserializer
The base implementation of AvroSchemaEvolution. Its strategy for evolving a record to a target schema is as follows:
serialize the record with the record's own schema -> deserialize it with the provided schema (the final schema)
This strategy is based on the Confluent serialization and deserialization implementation, but we do not allocate bytes for the MagicByte and schema id, because we do not need them.
For now this is the easiest way to convert a GenericContainer record to the desired schema.
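A minimal sketch of this strategy using plain Avro APIs (illustrative only; the helper below is an assumption, not this class's actual internals):

```scala
import java.io.ByteArrayOutputStream
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}

// Hypothetical helper: serialize with the record's own (writer) schema,
// then deserialize with the target (reader) schema. Avro's schema
// resolution performs the actual evolution.
def evolve(record: GenericRecord, readerSchema: Schema): GenericRecord = {
  val writerSchema = record.getSchema
  val out = new ByteArrayOutputStream()
  val encoder = EncoderFactory.get().binaryEncoder(out, null)
  new GenericDatumWriter[GenericRecord](writerSchema).write(record, encoder)
  encoder.flush()
  // Note: no MagicByte and no schema id are written before the payload.
  val decoder = DecoderFactory.get().binaryDecoder(out.toByteArray, null)
  new GenericDatumReader[GenericRecord](writerSchema, readerSchema).read(null, decoder)
}
```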
Inheritance
- DefaultAvroSchemaEvolution
- RecordDeserializer
- DatumReaderWriterMixin
- AvroSchemaEvolution
- AnyRef
- Any
Instance Constructors
- new DefaultAvroSchemaEvolution()
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def alignRecordToSchema(record: GenericContainer, schema: Schema): Any
  - Definition Classes: DefaultAvroSchemaEvolution → AvroSchemaEvolution
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def canBeEvolved(record: GenericContainer, schema: Schema): Boolean
  - Definition Classes: DefaultAvroSchemaEvolution → AvroSchemaEvolution
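A hypothetical usage sketch of the two public members above (the caller and its error handling are assumptions):

```scala
import org.apache.avro.Schema
import org.apache.avro.generic.GenericContainer

// Check compatibility first, then align the record to the target schema.
def evolveOrFail(record: GenericContainer, targetSchema: Schema): Any = {
  val evolution = new DefaultAvroSchemaEvolution()
  if (evolution.canBeEvolved(record, targetSchema))
    evolution.alignRecordToSchema(record, targetSchema)
  else
    throw new IllegalArgumentException(s"Record cannot be evolved to schema: $targetSchema")
}
```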
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native() @HotSpotIntrinsicCandidate()
- def createDatumReader(writerSchema: Schema, readerSchema: Schema, useSchemaReflection: Boolean, useSpecificAvroReader: Boolean): DatumReader[AnyRef]
  - Definition Classes: DatumReaderWriterMixin
- def createDatumWriter(record: Any, schema: Schema, useSchemaReflection: Boolean): GenericDatumWriter[Any]
  - Definition Classes: DatumReaderWriterMixin
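An illustrative sketch of the kind of choice these two factory methods make (the exact logic is an assumption, not the mixin's actual implementation):

```scala
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericDatumWriter}
import org.apache.avro.io.DatumReader
import org.apache.avro.reflect.{ReflectDatumReader, ReflectDatumWriter}
import org.apache.avro.specific.SpecificDatumReader

// Pick a reader implementation based on the configuration flags.
def datumReader(writer: Schema, reader: Schema,
                useSchemaReflection: Boolean, useSpecificAvroReader: Boolean): DatumReader[AnyRef] =
  if (useSchemaReflection) new ReflectDatumReader[AnyRef](writer, reader)
  else if (useSpecificAvroReader) new SpecificDatumReader[AnyRef](writer, reader)
  else new GenericDatumReader[AnyRef](writer, reader)

// ReflectDatumWriter extends GenericDatumWriter, so both branches fit the return type.
def datumWriter(schema: Schema, useSchemaReflection: Boolean): GenericDatumWriter[Any] =
  if (useSchemaReflection) new ReflectDatumWriter[Any](schema)
  else new GenericDatumWriter[Any](schema)
```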
- final val decoderFactory: DecoderFactory
  - Attributes: protected
  - Definition Classes: DefaultAvroSchemaEvolution → RecordDeserializer
- def deserializePayloadToSchema(payload: Array[Byte], writerSchema: Schema, readerSchema: Schema): Any
  Adapted, with some modifications, from AbstractKafkaAvroDeserializer#DeserializationContext.read. We pass in the record's buffer data and the schema that will be used to convert the record.
  - Attributes: protected
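A minimal sketch of what such an adapted read can look like with plain Avro APIs (an assumption, not the actual adapted code):

```scala
import org.apache.avro.Schema
import org.apache.avro.generic.GenericDatumReader
import org.apache.avro.io.DecoderFactory

// Hypothetical equivalent: decode a raw payload (no MagicByte, no schema id)
// written with writerSchema, resolving it to readerSchema.
def deserializePayload(payload: Array[Byte], writerSchema: Schema, readerSchema: Schema): Any = {
  val decoder = DecoderFactory.get().binaryDecoder(payload, null)
  new GenericDatumReader[AnyRef](writerSchema, readerSchema).read(null, decoder)
}
```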
- def deserializeRecord(readerSchemaData: RuntimeSchemaData[AvroSchema], reader: DatumReader[AnyRef], buffer: ByteBuffer, bufferDataStart: Int): AnyRef
  - Attributes: protected
  - Definition Classes: RecordDeserializer
- final val encoderFactory: EncoderFactory
  - Attributes: protected
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- val primitives: Map[String, Schema]
  Used to check whether the writerSchema is primitive when creating a DatumReader (createDatumReader).
  - Attributes: protected
  - Definition Classes: DatumReaderWriterMixin
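For illustration, such a map can be built from Avro's primitive types roughly like this (an assumption about its contents, not the mixin's actual definition):

```scala
import org.apache.avro.Schema

// Map primitive type names ("string", "int", ...) to their schemas, so a
// primitive writer schema can be recognised by name in createDatumReader.
val primitives: Map[String, Schema] =
  Seq(Schema.Type.NULL, Schema.Type.BOOLEAN, Schema.Type.INT, Schema.Type.LONG,
      Schema.Type.FLOAT, Schema.Type.DOUBLE, Schema.Type.BYTES, Schema.Type.STRING)
    .map(t => t.getName -> Schema.create(t))
    .toMap
```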
- def schemaIdSerializationEnabled: Boolean
  - Attributes: protected
  - Definition Classes: DefaultAvroSchemaEvolution → RecordDeserializer
- def serializeRecord(record: GenericContainer): Array[Byte]
  Record serialization method, largely copied from AbstractKafkaAvroSerializer#DeserializationContext.read. We use the Confluent serialization mechanism, but skip some of its features:
  - fetching the schema from the registry
  - fetching the schema id
  - serializing the MagicByte and version
  For serialization we use the schema from the record.
  - Attributes: protected
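A hedged sketch of this kind of header-less serialization, mirroring the first half of the strategy sketch under the class description (names are illustrative):

```scala
import java.io.ByteArrayOutputStream
import org.apache.avro.generic.{GenericContainer, GenericDatumWriter}
import org.apache.avro.io.EncoderFactory

// Serialize with the schema taken from the record itself; no registry
// lookup and no MagicByte/schema id are involved.
def serialize(record: GenericContainer): Array[Byte] = {
  val out = new ByteArrayOutputStream()
  val encoder = EncoderFactory.get().binaryEncoder(out, null)
  new GenericDatumWriter[GenericContainer](record.getSchema).write(record, encoder)
  encoder.flush()
  out.toByteArray
}
```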
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- final val useSchemaReflection: Boolean(false)
  In the future this could be made configurable.
  - Attributes: protected
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
Deprecated Value Members
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] ) @Deprecated @deprecated
  - Deprecated: (Since version ) see corresponding Javadoc for more information.