digraph G {
0 [labelType="html" label="<br><b>AdaptiveSparkPlan</b><br><br>"];
1 [labelType="html" label="<b>Execute InsertIntoHadoopFsRelationCommand</b><br><br>task commit time: 1 ms<br>number of written files: 1<br>job commit time: 10 ms<br>number of output rows: 5,188<br>number of dynamic part: 0<br>written output: 361.7 KiB"];
2 [labelType="html" label="<br><b>WriteFiles</b><br><br>"];
3 [labelType="html" label="<b>Exchange</b><br><br>shuffle records written: 5,188<br>local merged chunks fetched: 0<br>shuffle write time: 1 ms<br>remote merged bytes read: 0.0 B<br>local merged blocks fetched: 0<br>corrupt merged block chunks: 0<br>remote merged reqs duration: 0 ms<br>remote merged blocks fetched: 0<br>records read: 5,188<br>local bytes read: 150.5 KiB<br>fetch wait time: 0 ms<br>remote bytes read: 0.0 B<br>merged fetch fallback count: 0<br>local blocks read: 1<br>remote merged chunks fetched: 0<br>remote blocks read: 0<br>data size: 526.9 KiB<br>local merged bytes read: 0.0 B<br>number of partitions: 1<br>remote reqs duration: 0 ms<br>remote bytes read to disk: 0.0 B<br>shuffle bytes written: 150.5 KiB"];
subgraph cluster4 {
isCluster="true";
label="WholeStageCodegen (1)\n \nduration: 123 ms";
5 [labelType="html" label="<b>Scan ExistingRDD</b><br><br>number of output rows: 5,188"];
}
1->0;
2->1;
3->2;
5->3;
}
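The graph above is the Graphviz DOT representation of the physical plan as emitted by the Spark UI: node ids map to operators and each `a->b` edge means operator `a` feeds its output into operator `b`. As a minimal sketch (edge list and node names copied from the graph above; not a general DOT parser), the operator chain can be recovered by walking the edges from the leaf scan to the root:

```python
import re

# Edge list copied from the DOT graph above; "a->b" means operator a feeds operator b.
dot_edges = "1->0; 2->1; 3->2; 5->3;"

# Operator names by node id, taken from the node labels in the graph.
names = {
    0: "AdaptiveSparkPlan",
    1: "Execute InsertIntoHadoopFsRelationCommand",
    2: "WriteFiles",
    3: "Exchange",
    5: "Scan ExistingRDD",
}

# Parse "a->b" pairs into a child -> parent map.
edges = {int(a): int(b) for a, b in re.findall(r"(\d+)->(\d+)", dot_edges)}

# Walk from the leaf (the Scan, id 5) up to the root to recover execution order.
chain, node = [], 5
while node is not None:
    chain.append(names[node])
    node = edges.get(node)

print(" -> ".join(chain))
# Scan ExistingRDD -> Exchange -> WriteFiles -> Execute InsertIntoHadoopFsRelationCommand -> AdaptiveSparkPlan
```

This matches the tree and formatted plan below: rows are scanned from an existing RDD, shuffled into a single partition, and written out as one CSV file.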
AdaptiveSparkPlan isFinalPlan=true
+- Execute InsertIntoHadoopFsRelationCommand file:/data/output/export/csv/685ff93a-f4a1-40cd-8abb-db48fd35f7cb, false, CSV, [path=file:///data/output/export/csv/685ff93a-f4a1-40cd-8abb-db48fd35f7cb, nullValue=, ignoreLeadingWhiteSpace=true, quoteAll=false, sep=,, quote=", emptyValue=, ignoreTrailingWhiteSpace=true, escape=\, charset=UTF-8, lineSep=\n, header=true], ErrorIfExists, [N° Facture, Montant Facture, Magasin, Nb Article, Code Article, Vendeur, FactureDu, N° Client]
   +- WriteFiles
      +- Exchange SinglePartition, REPARTITION_BY_NUM, [plan_id=15288]
         +- WholeStageCodegen (1)
            +- Scan ExistingRDD[N° Facture#142397,Montant Facture#142398,Magasin#142399,Nb Article#142400,Code Article#142401,Vendeur#142402,FactureDu#142403,N° Client#142404]
== Physical Plan ==
AdaptiveSparkPlan (9)
+- == Final Plan ==
   Execute InsertIntoHadoopFsRelationCommand (5)
   +- WriteFiles (4)
      +- ShuffleQueryStage (3), Statistics(sizeInBytes=526.9 KiB, rowCount=5.19E+3)
         +- Exchange (2)
            +- * Scan ExistingRDD (1)
+- == Initial Plan ==
   Execute InsertIntoHadoopFsRelationCommand (8)
   +- WriteFiles (7)
      +- Exchange (6)
         +- Scan ExistingRDD (1)
(1) Scan ExistingRDD [codegen id : 1]
Output [8]: [N° Facture#142397, Montant Facture#142398, Magasin#142399, Nb Article#142400, Code Article#142401, Vendeur#142402, FactureDu#142403, N° Client#142404]
Arguments: [N° Facture#142397, Montant Facture#142398, Magasin#142399, Nb Article#142400, Code Article#142401, Vendeur#142402, FactureDu#142403, N° Client#142404], MapPartitionsRDD[2899] at createDataFrame at AbsExportExecutor.java:55, ExistingRDD, UnknownPartitioning(0)

(2) Exchange
Input [8]: [N° Facture#142397, Montant Facture#142398, Magasin#142399, Nb Article#142400, Code Article#142401, Vendeur#142402, FactureDu#142403, N° Client#142404]
Arguments: SinglePartition, REPARTITION_BY_NUM, [plan_id=15288]

(3) ShuffleQueryStage
Output [8]: [N° Facture#142397, Montant Facture#142398, Magasin#142399, Nb Article#142400, Code Article#142401, Vendeur#142402, FactureDu#142403, N° Client#142404]
Arguments: 0

(4) WriteFiles
Input [8]: [N° Facture#142397, Montant Facture#142398, Magasin#142399, Nb Article#142400, Code Article#142401, Vendeur#142402, FactureDu#142403, N° Client#142404]

(5) Execute InsertIntoHadoopFsRelationCommand
Input: []
Arguments: file:/data/output/export/csv/685ff93a-f4a1-40cd-8abb-db48fd35f7cb, false, CSV, [path=file:///data/output/export/csv/685ff93a-f4a1-40cd-8abb-db48fd35f7cb, nullValue=, ignoreLeadingWhiteSpace=true, quoteAll=false, sep=,, quote=", emptyValue=, ignoreTrailingWhiteSpace=true, escape=\, charset=UTF-8, lineSep=\n, header=true], ErrorIfExists, [N° Facture, Montant Facture, Magasin, Nb Article, Code Article, Vendeur, FactureDu, N° Client]

(6) Exchange
Input [8]: [N° Facture#142397, Montant Facture#142398, Magasin#142399, Nb Article#142400, Code Article#142401, Vendeur#142402, FactureDu#142403, N° Client#142404]
Arguments: SinglePartition, REPARTITION_BY_NUM, [plan_id=15281]

(7) WriteFiles
Input [8]: [N° Facture#142397, Montant Facture#142398, Magasin#142399, Nb Article#142400, Code Article#142401, Vendeur#142402, FactureDu#142403, N° Client#142404]

(8) Execute InsertIntoHadoopFsRelationCommand
Input: []
Arguments: file:/data/output/export/csv/685ff93a-f4a1-40cd-8abb-db48fd35f7cb, false, CSV, [path=file:///data/output/export/csv/685ff93a-f4a1-40cd-8abb-db48fd35f7cb, nullValue=, ignoreLeadingWhiteSpace=true, quoteAll=false, sep=,, quote=", emptyValue=, ignoreTrailingWhiteSpace=true, escape=\, charset=UTF-8, lineSep=\n, header=true], ErrorIfExists, [N° Facture, Montant Facture, Magasin, Nb Article, Code Article, Vendeur, FactureDu, N° Client]

(9) AdaptiveSparkPlan
Output: []
Arguments: isFinalPlan=true
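The writer options in the InsertIntoHadoopFsRelationCommand arguments (sep=`,`, quote=`"`, escape=`\`, lineSep=newline, header=true, quoteAll=false) map closely onto a standard CSV dialect. As a minimal sketch using Python's stdlib `csv` module (the rows are hypothetical; only the dialect settings come from the plan, and Spark's own writer may differ in edge cases), this is roughly what those options mean for the emitted file:

```python
import csv
import io

# Dialect taken from the plan's CSV options:
#   sep=","  quote='"'  escape="\"  lineSep="\n"  quoteAll=false  header=true
buf = io.StringIO()
writer = csv.writer(
    buf,
    delimiter=",",
    quotechar='"',
    escapechar="\\",
    lineterminator="\n",
    quoting=csv.QUOTE_MINIMAL,  # quoteAll=false: quote only fields that need it
)

# header=true: first row carries the column names (subset shown, from the plan's schema).
writer.writerow(["N° Facture", "Montant Facture", "Magasin"])
# Hypothetical data row: the embedded comma in "12,50" forces quoting of that field.
writer.writerow(["F-001", "12,50", "Paris"])

print(buf.getvalue())
```

Note also that `Exchange SinglePartition, REPARTITION_BY_NUM` collapses the data to one partition before `WriteFiles`, which is why the metrics report exactly one written file.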