  1. PySpark Overview — PySpark 4.0.1 documentation - Apache Spark

    Spark Connect is a client-server architecture within Apache Spark that enables remote connectivity to Spark clusters from any application. PySpark provides the client for the Spark Connect server, …

  2. RDD Programming Guide - Spark 4.0.1 Documentation

    Spark supports two types of shared variables: broadcast variables, which can be used to cache a value in memory on all nodes, and accumulators, which are variables that are only “added” to, such as …

  3. Examples - Apache Spark

    Spark allows you to perform DataFrame operations with programmatic APIs, write SQL, perform streaming analyses, and do machine learning. Spark saves you from learning multiple frameworks …

  4. Configuration - Spark 4.0.1 Documentation

    Spark provides three locations to configure the system: Spark properties control most application parameters and can be set by using a SparkConf object, or through Java system properties. …

  5. Spark Streaming - Spark 4.0.1 Documentation - Apache Spark

    Spark Streaming is an extension of the core Spark API that enables scalable, high-throughput, fault-tolerant stream processing of live data streams. Data can be ingested from many sources like Kafka, …

  6. Running Spark on Kubernetes - Spark 4.0.1 Documentation

    Spark executors must be able to connect to the Spark driver over a hostname and a port that is routable from the Spark executors. The specific network configuration that will be required for Spark to work in …

  7. JDBC To Other Databases - Spark 4.0.1 Documentation

    The below table describes the data type conversions from Spark SQL Data Types to MySQL data types, when creating, altering, or writing data to a MySQL table using the built-in jdbc data source with the …

  8. Web UI - Spark 4.0.1 Documentation

    Apache Spark provides a suite of web user interfaces (UIs) that you can use to monitor the status and resource consumption of your Spark cluster. Table of Contents

  9. Spark SQL, Built-in Functions

    Jul 30, 2009 · When SQL config 'spark.sql.parser.escapedStringLiterals' is enabled, it falls back to Spark 1.6 behavior regarding string literal parsing. For example, if the config is enabled, the pattern to …

  10. Feature Extraction and Transformation - RDD-based API - Apache Spark

    Find full example code at "examples/src/main/scala/org/apache/spark/examples/mllib/NormalizerExample.scala" in the Spark …