2017-09-11 6 views

While loading data from Cassandra, Spark throws a java.lang.IllegalArgumentException because a column in the table is of type LocalDate, and Spark cannot convert java.time.LocalDate when loading the data from Cassandra.

Below is my repository code:

private transient SparkConf sparkConf; 
private JavaSparkContext sc; 

public List<Person> findAll() { 
    sparkConf = new SparkConf(); 
    sparkConf.setAppName("investhry"); 
    sparkConf.setMaster("local[4]"); 
    sparkConf.set("spark.cassandra.connection.host", "localhost"); 
    sc = new JavaSparkContext(sparkConf); 
    JavaRDD<Person> personJavaRDD = javaFunctions(sc).cassandraTable("java_api", "person", mapRowTo(Person.class)); 
    List<Person> people = personJavaRDD.collect(); 
    sc.stop(); 
    return people; 
} 

Below is the error:

java.lang.IllegalArgumentException: Unsupported type: java.time.LocalDate 
at com.datastax.spark.connector.types.TypeConverter$.forCollectionType(TypeConverter.scala:930) 
at com.datastax.spark.connector.types.TypeConverter$.forType(TypeConverter.scala:943) 
at com.datastax.spark.connector.types.TypeConverter$.forType(TypeConverter.scala:962) 
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.converter(GettableDataToMappedTypeConverter.scala:108) 
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.com$datastax$spark$connector$rdd$reader$GettableDataToMappedTypeConverter$$converter(GettableDataToMappedTypeConverter.scala:117) 
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter$$anonfun$7.apply(GettableDataToMappedTypeConverter.scala:184) 
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter$$anonfun$7.apply(GettableDataToMappedTypeConverter.scala:181) 
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
at scala.collection.MapLike$DefaultKeySet.foreach(MapLike.scala:174) 
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244) 
at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:47) 
at scala.collection.SetLike$class.map(SetLike.scala:93) 
at scala.collection.AbstractSet.map(Set.scala:47) 
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.<init>(GettableDataToMappedTypeConverter.scala:181) 
at com.datastax.spark.connector.rdd.reader.ClassBasedRowReader.<init>(ClassBasedRowReader.scala:21) 
at com.datastax.spark.connector.rdd.reader.ClassBasedRowReaderFactory.rowReader(ClassBasedRowReader.scala:44) 
at com.datastax.spark.connector.rdd.reader.ClassBasedRowReaderFactory.rowReader(ClassBasedRowReader.scala:39) 
at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.rowReader(CassandraTableRowReaderProvider.scala:48) 
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.rowReader$lzycompute(CassandraTableScanRDD.scala:62) 
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.rowReader(CassandraTableScanRDD.scala:62) 
at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:138) 

Answer


Try using java.util.Date instead of java.time.LocalDate in your Person class; this should fix the problem.
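As a sketch, the mapped Person class could declare the date column as java.util.Date rather than java.time.LocalDate. The field names below are assumptions (the question does not show the class); for mapRowTo(Person.class) to work they must match the column names of the java_api.person table:

```java
import java.io.Serializable;
import java.util.Date;

// Hypothetical mapped class; field names are assumptions and must
// match the Cassandra column names for mapRowTo(Person.class).
public class Person implements Serializable {
    private Integer id;
    private String name;
    // java.util.Date instead of java.time.LocalDate, which the
    // connector's TypeConverter does not support here
    private Date birthDate;

    public Person() {} // no-arg constructor required by the row mapper

    public Integer getId() { return id; }
    public void setId(Integer id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public Date getBirthDate() { return birthDate; }
    public void setBirthDate(Date birthDate) { this.birthDate = birthDate; }
}
```

With this change the connector can use its built-in Date converter when materializing rows into Person objects.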