pyspark.sql.SparkSession

class pyspark.sql.SparkSession(sparkContext, jsparkSession=None)

The entry point to programming Spark with the Dataset and DataFrame API.
A SparkSession can be used to create DataFrames, register DataFrames as tables, execute SQL over tables, cache tables, and read parquet files. To create a SparkSession, use the following builder pattern:
builder
    A class attribute having a Builder to construct SparkSession instances.
Examples
>>> spark = SparkSession.builder \
...     .master("local") \
...     .appName("Word Count") \
...     .config("spark.some.config.option", "some-value") \
...     .getOrCreate()
>>> from datetime import datetime
>>> from pyspark.sql import Row
>>> spark = SparkSession(sc)
>>> allTypes = sc.parallelize([Row(i=1, s="string", d=1.0, l=1,
...     b=True, list=[1, 2, 3], dict={"s": 0}, row=Row(a=1),
...     time=datetime(2014, 8, 1, 14, 1, 5))])
>>> df = allTypes.toDF()
>>> df.createOrReplaceTempView("allTypes")
>>> spark.sql('select i+1, d+1, not b, list[1], dict["s"], time, row.a '
...     'from allTypes where b and i > 0').collect()
[Row((i + 1)=2, (d + 1)=2.0, (NOT b)=False, list[1]=2, dict[s]=0, time=datetime.datetime(2014, 8, 1, 14, 1, 5), a=1)]
>>> df.rdd.map(lambda x: (x.i, x.s, x.d, x.l, x.b, x.time, x.row.a, x.list)).collect()
[(1, 'string', 1.0, 1, True, datetime.datetime(2014, 8, 1, 14, 1, 5), 1, [1, 2, 3])]
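As described above, a session can also read parquet files and cache tables; the following is a minimal sketch of that flow, where the file path and the "age" column are hypothetical:

>>> # hypothetical path; assumes a parquet file with an "age" column exists there
>>> people = spark.read.parquet("/tmp/people.parquet")
>>> people.createOrReplaceTempView("people")
>>> spark.catalog.cacheTable("people")
>>> adults = spark.sql("select * from people where age >= 21")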
Methods
createDataFrame(data[, schema, …])
    Creates a DataFrame from an RDD, a list or a pandas.DataFrame.
getActiveSession()
    Returns the active SparkSession for the current thread, returned by the builder.
newSession()
    Returns a new SparkSession as a new session, which has separate SQLConf, registered temporary views and UDFs, but shared SparkContext and table cache.
range(start[, end, step, numPartitions])
    Creates a DataFrame with a single pyspark.sql.types.LongType column named id, containing elements in a range from start to end (exclusive) with step value step.
sql(sqlQuery)
    Returns a DataFrame representing the result of the given query.
stop()
    Stops the underlying SparkContext.
table(tableName)
    Returns the specified table as a DataFrame.
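The short sketch below exercises a few of the methods listed above, assuming an active session named spark as in the Examples; the view name "table1" and the sample row are illustrative only:

>>> df = spark.createDataFrame([("Alice", 1)], ["name", "age"])
>>> df.createOrReplaceTempView("table1")
>>> spark.table("table1").collect()
[Row(name='Alice', age=1)]
>>> spark.sql("select name from table1 where age >= 1").collect()
[Row(name='Alice')]
>>> spark.range(1, 7, 2).collect()
[Row(id=1), Row(id=3), Row(id=5)]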
Attributes
builder
    A class attribute having a Builder to construct SparkSession instances.
catalog
    Interface through which the user may create, drop, alter or query underlying databases, tables, functions, etc.
conf
    Runtime configuration interface for Spark.
read
    Returns a DataFrameReader that can be used to read data in as a DataFrame.
readStream
    Returns a DataStreamReader that can be used to read data streams as a streaming DataFrame.
sparkContext
    Returns the underlying SparkContext.
streams
    Returns a StreamingQueryManager that allows managing all the StreamingQuery instances active on this context.
udf
    Returns a UDFRegistration for UDF registration.
version
    The version of Spark on which this application is running.
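A brief sketch touching several of the attributes listed above, again assuming the spark and sc objects from the Examples; the configuration value and the "strlen" UDF are illustrative only:

>>> spark.sparkContext is sc               # the underlying SparkContext
True
>>> v = spark.version                      # e.g. '3.0.1'; depends on the deployment
>>> spark.conf.set("spark.sql.shuffle.partitions", "4")
>>> spark.conf.get("spark.sql.shuffle.partitions")
'4'
>>> tables = spark.catalog.listTables()    # includes temp views registered earlier
>>> strlen = spark.udf.register("strlen", lambda s: len(s), "int")
>>> [row[0] for row in spark.sql("select strlen('spark')").collect()]
[5]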