
Commit eed9e39

nlarew authored and mongoKart committed

Add missing meta descriptions (#251)
(cherry picked from commit c8f8c7b)

1 parent d5af8ba commit eed9e39

17 files changed: +39 −0 lines changed

source/api-docs.txt

Lines changed: 3 additions & 0 deletions
@@ -2,6 +2,9 @@
 API Documentation
 =================
 
+.. meta::
+   :description: Explore API documentation for the Spark Connector compatible with Scala 2.13 and 2.12.
+
 - `Spark Connector for Scala 2.13 <https://www.javadoc.io/doc/org.mongodb.spark/{+artifact-id-2-13+}/{+current-version+}/index.html>`__
 - `Spark Connector for Scala 2.12 <https://www.javadoc.io/doc/org.mongodb.spark/{+artifact-id-2-12+}/{+current-version+}/index.html>`__

source/batch-mode.txt

Lines changed: 3 additions & 0 deletions
@@ -2,6 +2,9 @@
 Batch Mode
 ==========
 
+.. meta::
+   :description: Explore how to use the Spark Connector to read and write data to MongoDB in batch mode using Spark's Dataset and DataFrame APIs.
+
 .. contents:: On this page
    :local:
    :backlinks: none
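The description added in this hunk refers to reading and writing to MongoDB in batch mode. In the connector's documentation, batch operations are driven by flat option maps keyed under a `spark.mongodb.read.*` or `spark.mongodb.write.*` prefix; the sketch below builds such a map in plain Python. The property prefix convention follows the connector docs, but the helper name and the URI, database, and collection values are placeholders, not part of this commit:

```python
def mongo_batch_options(mode: str, uri: str, database: str, collection: str) -> dict:
    """Build a flat option map for a batch read or write.

    Assumes the "spark.mongodb.<mode>.*" property prefix convention
    described in the connector docs; all values are placeholders.
    """
    if mode not in ("read", "write"):
        raise ValueError("mode must be 'read' or 'write'")
    prefix = f"spark.mongodb.{mode}"
    return {
        f"{prefix}.connection.uri": uri,
        f"{prefix}.database": database,
        f"{prefix}.collection": collection,
    }

# Example: option map for a batch read from test_db.people
read_opts = mongo_batch_options("read", "mongodb://localhost:27017", "test_db", "people")
```

Such a map would typically be passed to a DataFrame reader or writer as its options; the same keys can also be set session-wide.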

source/batch-mode/batch-read-config.txt

Lines changed: 1 addition & 0 deletions
@@ -16,6 +16,7 @@ Batch Read Configuration Options
 
 .. meta::
    :keywords: partitioner, customize, settings
+   :description: Configure batch read options for the Spark Connector, including connection URI, database, collection, and partitioner settings for efficient data processing.
 
 .. _spark-batch-input-conf:

source/batch-mode/batch-read.txt

Lines changed: 3 additions & 0 deletions
@@ -4,6 +4,9 @@
 Read from MongoDB in Batch Mode
 ===============================
 
+.. meta::
+   :description: Learn how to read data from MongoDB in batch mode using Spark, including configuration settings, schema inference, and applying filters for efficient data retrieval.
+
 .. toctree::
    :caption: Batch Read Configuration Options
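The batch-read description above mentions applying filters for efficient retrieval. One mechanism the connector documents for this is passing an aggregation pipeline as a JSON-encoded read option, so the filter runs server-side in MongoDB. A hedged sketch of building that option map: the `aggregation.pipeline` option name follows the connector docs, while the database, collection, and `$match` filter values are invented for illustration:

```python
import json

# Sketch of a batch-read option map that pushes a filter down to
# MongoDB as an aggregation pipeline; all values are placeholders.
pipeline = [{"$match": {"status": "active"}}]
read_options = {
    "connection.uri": "mongodb://localhost:27017",
    "database": "test_db",
    "collection": "people",
    # The pipeline is supplied as its JSON string representation.
    "aggregation.pipeline": json.dumps(pipeline),
}
```

Filtering in the pipeline means only matching documents cross the wire, rather than being loaded and filtered on the Spark side.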

source/batch-mode/batch-write-config.txt

Lines changed: 3 additions & 0 deletions
@@ -4,6 +4,9 @@
 Batch Write Configuration Options
 =================================
 
+.. meta::
+   :description: Configure batch write operations to MongoDB using various properties like connection URI, database, collection, and write concern options.
+
 .. contents:: On this page
    :local:
    :backlinks: none

source/batch-mode/batch-write.txt

Lines changed: 3 additions & 0 deletions
@@ -4,6 +4,9 @@
 Write to MongoDB in Batch Mode
 ==============================
 
+.. meta::
+   :description: Learn how to write data to MongoDB in batch mode using the Spark Connector, specifying format and configuration settings for Java, Python, and Scala.
+
 .. toctree::
    :caption: Batch Write Configuration Options

source/configuration.txt

Lines changed: 3 additions & 0 deletions
@@ -4,6 +4,9 @@
 Configuring Spark
 =================
 
+.. meta::
+   :description: Configure read and write operations in Spark using `SparkConf`, options maps, or system properties for batch and streaming modes.
+
 .. contents:: On this page
    :local:
    :backlinks: none
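The configuration description above lists several ways to supply settings (`SparkConf`, options maps, system properties). As a rough sketch of how such layering is commonly described, per-operation options override session-level defaults; the helper below and its precedence rule are illustrative assumptions, not the connector's actual resolution code:

```python
def effective_options(session_defaults: dict, operation_options: dict) -> dict:
    """Merge session-level defaults (e.g. set via SparkConf) with a
    per-operation options map; the per-operation map wins on conflict.
    This precedence is an illustrative assumption."""
    merged = dict(session_defaults)
    merged.update(operation_options)
    return merged

# A session default overridden for one read operation:
defaults = {"spark.mongodb.read.database": "test_db"}
override = {"spark.mongodb.read.database": "analytics"}
```

Layering this way lets a shared connection URI live in session configuration while each read or write names its own database and collection.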

source/faq.txt

Lines changed: 3 additions & 0 deletions
@@ -2,6 +2,9 @@
 FAQ
 ===
 
+.. meta::
+   :description: Find solutions for achieving data locality, resolving pipeline stage errors, using mTLS for authentication, and sharing a MongoClient instance across threads with the Spark Connector.
+
 How can I achieve data locality?
 --------------------------------

source/getting-started.txt

Lines changed: 1 addition & 0 deletions
@@ -14,6 +14,7 @@ Getting Started with the {+connector-short+}
 
 .. meta::
    :keywords: quick start, tutorial, code example
+   :description: Get started with the Spark Connector by setting up dependencies, configuring connections, and integrating with platforms like Amazon EMR, Databricks, Docker, and Kubernetes.
 
 Prerequisites
 -------------

source/index.txt

Lines changed: 3 additions & 0 deletions
@@ -2,6 +2,9 @@
 MongoDB Connector for Spark
 ===========================
 
+.. meta::
+   :description: Integrate MongoDB with Apache Spark using the MongoDB Connector for Spark, supporting Spark Structured Streaming.
+
 .. toctree::
    :titlesonly:
