org.apache.spark.sql.execution.datasources.jdbc

SpliceRelation

case class SpliceRelation(jdbcOptions: JDBCOptions)(sqlContext: SQLContext, userSchema: Option[StructType]) extends BaseRelation with PrunedFilteredScan with InsertableRelation with Product with Serializable

Created by jleach on 4/7/17.

Linear Supertypes
Serializable, Serializable, Product, Equals, InsertableRelation, PrunedFilteredScan, BaseRelation, AnyRef, Any

Instance Constructors

  1. new SpliceRelation(jdbcOptions: JDBCOptions)(sqlContext: SQLContext, userSchema: Option[StructType])
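
    A hedged sketch of constructing the relation directly, following the two parameter lists above. The JDBC URL and table name are placeholders, not values from this documentation; JDBCOptions is Spark's internal holder for JDBC source options, for which "url" and "dbtable" are the required keys.

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions

    val spark = SparkSession.builder().master("local[*]").getOrCreate()

    // Placeholder connection options; substitute a real Splice Machine
    // JDBC URL and table name.
    val options = new JDBCOptions(Map(
      "url"     -> "jdbc:splice://localhost:1527/splicedb",
      "dbtable" -> "SPLICE.MY_TABLE"
    ))

    // The second parameter list supplies the SQLContext and an optional
    // user-provided schema; passing None lets the relation derive the
    // schema from the Splice Machine table itself (see `schema` below).
    val relation = SpliceRelation(options)(spark.sqlContext, None)
    ```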

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row]

    Definition Classes
    SpliceRelation → PrunedFilteredScan
  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  10. def insert(data: DataFrame, overwrite: Boolean): Unit

    Definition Classes
    SpliceRelation → InsertableRelation
  11. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  12. val jdbcOptions: JDBCOptions

  13. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. val needConversion: Boolean

    Definition Classes
    SpliceRelation → BaseRelation
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. def schema: StructType

Generates a SparkSQL schema object so SparkSQL knows what is being provided by this BaseRelation.

    returns

    schema generated from the Splice Machine table's schema

    Definition Classes
    SpliceRelation → BaseRelation
  18. def sizeInBytes: Long

    Definition Classes
    BaseRelation
  19. val sqlContext: SQLContext

    Definition Classes
    SpliceRelation → BaseRelation
  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. def unhandledFilters(filters: Array[Filter]): Array[Filter]

    Definition Classes
    SpliceRelation → BaseRelation
  22. var userSchema: Option[StructType]

  23. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
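
Taken together, the overridden members above are the hooks Spark's planner calls: buildScan for a pruned, filtered read, unhandledFilters to learn which predicates it must re-apply itself, and insert for writes. A hedged usage sketch (column names, the filter, and the `relation`/`df` values are hypothetical; assume `relation` is an already-constructed SpliceRelation and `df` a DataFrame with a matching schema):

    ```scala
    import org.apache.spark.sql.sources.{Filter, GreaterThan}

    // A pruned, filtered read: Spark computes the required columns and the
    // pushable filters, then calls buildScan with both.
    val filters: Array[Filter] = Array(GreaterThan("AGE", 21))
    val rdd = relation.buildScan(Array("ID", "NAME"), filters)

    // Filters the relation cannot evaluate remotely are reported back, so
    // Spark re-applies them on top of the scan's output.
    val leftover = relation.unhandledFilters(filters)

    // A write: InsertableRelation.insert receives the DataFrame and an
    // overwrite flag (true replaces existing rows, false appends).
    relation.insert(df, overwrite = false)
    ```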
