Distributed analysis job fails
hqd
13 EXP · July 31, 2022
I set up a Spark cluster on local virtual machines and connected to it from iServer running in a local Windows environment. When I run a distributed analysis job, it fails.

Error details:

2022-7-31 15:02:01 - WARN - Distributed analysis job fa232c7b_33de_42f0_8726_60206116f426 failed. Cause: Job aborted due to stage failure: Task 2 in stage 0.0 failed 4 times, most recent failure: Lost task 2.3 in stage 0.0 (TID 20, 192.168.10.107, executor 1): java.lang.IllegalArgumentException: open datasource failed
    at com.supermap.bdt.rddprovider.sdx.SDXUtils$.getDataSource(SDXUtils.scala:96)
    at com.supermap.bdt.rddprovider.sdx.SDXFeatureRDD.compute(SDXFeatureRDD.scala:263)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

1 Answer

Hello. The error message indicates that opening the datasource failed. Please check whether the datasource used by the distributed analysis job can be opened normally.
3,148 EXP · August 1, 2022
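As an illustration of the check suggested in the answer above, here is a minimal sketch that tries to open a datasource directly with the SuperMap iObjects Java API (Workspace, DatasourceConnectionInfo). The path, engine type, and alias below are placeholders, not values taken from the question; replace them with the settings configured for the failing job.

import com.supermap.data.Datasource;
import com.supermap.data.DatasourceConnectionInfo;
import com.supermap.data.EngineType;
import com.supermap.data.Workspace;

public class DatasourceOpenCheck {
    public static void main(String[] args) {
        // Placeholder connection info -- fill in the datasource settings used by the
        // distributed analysis job (path/server, engine type, credentials for database engines).
        DatasourceConnectionInfo info = new DatasourceConnectionInfo();
        info.setServer("/data/sample.udb");   // must be reachable from the Spark executor nodes
        info.setEngineType(EngineType.UDB);
        info.setAlias("jobDatasource");

        Workspace workspace = new Workspace();
        try {
            Datasource ds = workspace.getDatasources().open(info);
            if (ds == null) {
                System.out.println("open datasource failed: check path, engine type and credentials");
            } else {
                System.out.println("datasource opened, dataset count: " + ds.getDatasets().getCount());
            }
        } finally {
            workspace.close();
        }
    }
}

Note that in the stack trace the open happens inside the Spark task (executor 1 at 192.168.10.107), so a datasource that opens fine on the Windows iServer machine may still be unreachable from the cluster nodes; running the check from the executor side, or against the same path the executors see, is the more telling test.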
...