diff --git a/docs/storage-openstack-swift.md b/docs/storage-openstack-swift.md
index ad0d3c6d7deaf..54e5bdce17672 100644
--- a/docs/storage-openstack-swift.md
+++ b/docs/storage-openstack-swift.md
@@ -6,7 +6,7 @@ title: Accessing OpenStack Swift from Spark
Spark's support for Hadoop InputFormat allows it to process data in OpenStack Swift using the
same URI formats as in Hadoop. You can specify a path in Swift as input through a
URI of the form swift://container.PROVIDER/path
. You will also need to set your
-Swift security credentials, through core-sites.xml
or via
+Swift security credentials, through core-site.xml
or via
SparkContext.hadoopConfiguration
.
The current Swift driver requires Swift to use the Keystone authentication method.
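+Once the credentials are in place, a Swift path behaves like any other Hadoop-compatible URI. As a sketch (the container name logs
+and provider name SparkTest
+are illustrative assumptions, not fixed values):
+{% highlight scala %}
+// Assumes the Swift driver and Keystone credentials are already configured
+// (see the Configuration Parameters section).
+val data = sc.textFile("swift://logs.SparkTest/input.txt")
+println(data.count())
+{% endhighlight %}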
@@ -37,7 +37,7 @@ For example, for Maven support, add the following to the pom.xml
fi
# Configuration Parameters
-Create core-sites.xml
and place it inside /spark/conf
directory.
+Create core-site.xml
and place it inside /spark/conf
directory.
There are two main categories of parameters that should be configured: the declaration of the
Swift driver, and the parameters required by Keystone.
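+For example, the driver-declaration part of core-site.xml
+typically looks like the following sketch (the filesystem class name is the one shipped with the hadoop-openstack
+module; treat it as an assumption if your Hadoop version differs):
+{% highlight xml %}
+<configuration>
+  <property>
+    <name>fs.swift.impl</name>
+    <value>org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem</value>
+  </property>
+</configuration>
+{% endhighlight %}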
@@ -100,7 +100,7 @@ contains a list of Keystone mandatory parameters. PROVIDER
can be a
For example, assume PROVIDER=SparkTest
and Keystone contains user tester
with password testing
-defined for tenant test
. Than core-sites.xml
should include:
+defined for tenant test
. Then core-site.xml
should include:
{% highlight xml %}
@@ -146,7 +146,7 @@ Notice that
fs.swift.service.PROVIDER.tenant
,
fs.swift.service.PROVIDER.username
,
fs.swift.service.PROVIDER.password
contain sensitive information, and keeping them in
-core-sites.xml
is not always a good approach.
-We suggest to keep those parameters in core-sites.xml
for testing purposes when running Spark
+core-site.xml
is not always a good approach.
+We suggest keeping those parameters in core-site.xml
for testing purposes when running Spark
via spark-shell
.
For job submissions, they should be provided via sparkContext.hadoopConfiguration
.
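+For example, a sketch of setting the Keystone parameters programmatically (the provider name SparkTest
+and the credential values mirror the example above and are not real endpoints):
+{% highlight scala %}
+val conf = sc.hadoopConfiguration
+// Keystone endpoint and credentials for the SparkTest provider (illustrative values)
+conf.set("fs.swift.service.SparkTest.auth.url", "http://127.0.0.1:5000/v2.0/tokens")
+conf.set("fs.swift.service.SparkTest.tenant", "test")
+conf.set("fs.swift.service.SparkTest.username", "tester")
+conf.set("fs.swift.service.SparkTest.password", "testing")
+{% endhighlight %}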