How to read OraclePlus data with SDXReader
3 EXP · July 19, 2018

Hello,

I want to use SDXReader.readFromDS to read SDX+ data stored in OraclePlus. How should I connect to it? With the following code I keep getting an error:

// Connection info for the OraclePlus datasource, then read the "HYDA" dataset into an RDD with one partition
val dsConnInfo = DSConnectionInfo(DatasourceType.OraclePlus, "10.10.0.249:1521/orcl", "C_JCDL_530000", "", Some("C_JCDL_530000"), Some("xxxx"))
val rdd = SDXReader.readFromDS(sc, dsConnInfo, "HYDA", numSlices = 1, isDriver = false)

The error message is:

[bdt]: open datasource 10.10.0.249:1521/orcl failed
Exception in thread "main" java.lang.IllegalArgumentException:  dataset is null
    at com.supermap.bdt.io.sdx.SDXFeatureRDD.<init>(SDXFeatureRDD.scala:71)
    at com.supermap.bdt.io.sdx.SDXFeatureRDD$.apply(SDXFeatureRDD.scala:380)
    at com.supermap.bdt.io.sdx.SDXReader$.readFromDS(SDXReader.scala:370)
    at com.bzhcloud.spatial.process.GetData$.main(GetData.scala:55)
    at com.bzhcloud.spatial.process.GetData.main(GetData.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

1 Answer

Hello! Please see the answer at http://qa.supermap.com/34535.
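
In case that link is unavailable, here is a minimal diagnostic sketch, not the content of the linked answer: the "[bdt]: open datasource ... failed" line means the OraclePlus datasource could not be opened at all, so the first step is to verify the connection parameters outside of Spark with the plain SuperMap iObjects Java API (com.supermap.data). The server value "orcl" below is a hypothetical Oracle TNS alias and the password is a placeholder; both are assumptions you must replace with your own configuration.

// Standalone connectivity check for an OraclePlus datasource (sketch, assumptions noted above)
import com.supermap.data.{DatasourceConnectionInfo, EngineType, Workspace}

object OraclePlusCheck {
  def main(args: Array[String]): Unit = {
    val info = new DatasourceConnectionInfo()
    info.setEngineType(EngineType.ORACLEPLUS)
    info.setServer("orcl")          // assumption: Oracle TNS alias known to the local Oracle client
    info.setUser("C_JCDL_530000")
    info.setPassword("xxxx")        // placeholder, as in the question

    val workspace = new Workspace()
    val ds = workspace.getDatasources.open(info)
    if (ds == null) {
      // The open failed before any dataset lookup: check the Oracle client setup and the server string
      println(s"open datasource failed: ${info.getServer}")
    } else {
      println(s"datasets available: ${ds.getDatasets.getCount}")
      workspace.close()
    }
  }
}

If this standalone check fails with the same message, the problem lies in the connection parameters or the Oracle client environment on that machine rather than in SDXReader itself.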

3,362 EXP · July 19, 2018
...