public class ExplainCommand extends SparkPlan implements Command, scala.Product, scala.Serializable
:: DeveloperApi ::
Note that this command takes in a logical plan and runs the optimizer on it, but does not actually execute the plan.
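For illustration, here is a minimal usage sketch. It assumes a local SparkContext, a SQLContext (or HiveContext) whose SQL dialect accepts EXPLAIN, and a hypothetical table named people that has already been registered; the statement is planned as an ExplainCommand, so only the plan text is produced and the inner SELECT is never run.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("explain-demo"))
val sqlContext = new SQLContext(sc)

// Hypothetical table "people" is assumed to be registered beforehand.
// EXPLAIN [EXTENDED] is planned as an ExplainCommand: the optimizer runs over the
// logical plan of the inner SELECT, but the SELECT itself is never executed.
// Each returned row carries one line of the plan description.
sqlContext.sql("EXPLAIN EXTENDED SELECT name FROM people").collect().foreach(println)
```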
| Constructor and Description |
|---|
| ExplainCommand(org.apache.spark.sql.catalyst.plans.logical.LogicalPlan logicalPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean extended, SQLContext context) |

| Modifier and Type | Method and Description |
|---|---|
| RDD<org.apache.spark.sql.catalyst.expressions.Row> | execute() Runs this query returning the result as an RDD. |
| boolean | extended() |
| org.apache.spark.sql.catalyst.plans.logical.LogicalPlan | logicalPlan() |
| scala.collection.immutable.List<SQLContext> | otherCopyArgs() |
| scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> | output() |

Methods inherited from class org.apache.spark.sql.execution.SparkPlan:
codegenEnabled, executeCollect, makeCopy, outputPartitioning, requiredChildDistribution

Methods inherited from class org.apache.spark.sql.catalyst.plans.QueryPlan:
expressions, org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionDown$1, org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionUp$1, outputSet, printSchema, schema, schemaString, transformAllExpressions, transformExpressions, transformExpressionsDown, transformExpressionsUp

Methods inherited from class org.apache.spark.sql.catalyst.trees.TreeNode:
apply, argString, asCode, children, collect, fastEquals, flatMap, foreach, generateTreeString, getNodeNumbered, id, map, mapChildren, nextId, nodeName, numberedTreeString, sameInstance, simpleString, stringArgs, toString, transform, transformChildrenDown, transformChildrenUp, transformDown, transformUp, treeString, withNewChildren

Methods inherited from interface scala.Product:
productArity, productElement, productIterator, productPrefix

Methods inherited from interface org.apache.spark.Logging:
initialized, initializeIfNecessary, initializeLogging, initLock, isTraceEnabled, log_, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning

public ExplainCommand(org.apache.spark.sql.catalyst.plans.logical.LogicalPlan logicalPlan,
                      scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output,
                      boolean extended,
                      SQLContext context)
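The constructor can also be invoked directly on an existing logical plan. A minimal sketch under assumptions: query is a hypothetical, already-built SchemaRDD, sqlContext is its SQLContext, and in Scala the SQLContext argument sits in a second (curried) parameter list, which is why it resurfaces through otherCopyArgs().

```scala
import org.apache.spark.sql.catalyst.expressions.AttributeReference
import org.apache.spark.sql.catalyst.types.StringType
import org.apache.spark.sql.execution.ExplainCommand

// Hypothetical inputs: `query` is an existing SchemaRDD, `sqlContext` its context.
val logicalPlan = query.queryExecution.logical

// A single string-typed output attribute to carry the plan text.
val planAttribute = AttributeReference("plan", StringType, nullable = false)()

// extended = true requests the more detailed explain output.
val explain = ExplainCommand(logicalPlan, Seq(planAttribute), extended = true)(sqlContext)
```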
public org.apache.spark.sql.catalyst.plans.logical.LogicalPlan logicalPlan()
public scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output()
Specified by:
output in class org.apache.spark.sql.catalyst.plans.QueryPlan<SparkPlan>

public boolean extended()
public RDD<org.apache.spark.sql.catalyst.expressions.Row> execute()
Specified by:
execute in class SparkPlan

public scala.collection.immutable.List<SQLContext> otherCopyArgs()
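Tying the method list back to everyday use: when a statement such as EXPLAIN SELECT ... is planned, the root of the physical plan is expected to be an ExplainCommand, and execute() is what produces the rows of plan text. A hedged inspection sketch, reusing the sqlContext and the hypothetical people table from the first example:

```scala
import org.apache.spark.sql.execution.ExplainCommand

val explained = sqlContext.sql("EXPLAIN SELECT name FROM people")

// The planner is expected to place an ExplainCommand at the root of the physical plan.
explained.queryExecution.executedPlan match {
  case cmd: ExplainCommand =>
    // execute() returns an RDD of single-column rows, one per line of plan text;
    // the underlying SELECT is never run.
    cmd.execute().collect().foreach(row => println(row.getString(0)))
  case other =>
    println("Unexpected physical plan: " + other.simpleString)
}
```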