Commit 4c2ccba (1 parent: 4bbc3cf)
Showing 6 changed files with 92 additions and 0 deletions.
@@ -0,0 +1 @@
@@ -0,0 +1,72 @@
# Databricks notebook source
# MAGIC %md
# MAGIC ## Sample ETL Process with PySpark and Spark SQL

# COMMAND ----------

# Step 1: Import necessary libraries
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, when

# COMMAND ----------

# Step 2: Create a SparkSession (automatically available in Databricks)
spark = SparkSession.builder.appName("ETL Example").getOrCreate()

# COMMAND ----------

# Step 3: Load Data
# Load a sample CSV file into a DataFrame; the airlines dataset encodes
# missing values as "NA", so map those to null to make Step 5 effective
df = spark.read.csv("/databricks-datasets/airlines/part-00000", header=True, inferSchema=True, nullValue="NA")

# COMMAND ----------

# Step 4: Data Exploration
# Display the first few rows of the DataFrame
df.show(5)

# COMMAND ----------

# Step 5: Data Transformation with PySpark
# Clean up some data, e.g., replace null values in the 'ArrDelay' column with 0
df_cleaned = df.withColumn("ArrDelay", when(col("ArrDelay").isNull(), 0).otherwise(col("ArrDelay")))

# COMMAND ----------

# Step 6: Data Transformation with Spark SQL
# Register the DataFrame as a SQL temporary view
df_cleaned.createOrReplaceTempView("flights")

# COMMAND ----------

# Use Spark SQL to perform a simple transformation
# (the airlines dataset names its carrier column "UniqueCarrier")
transformed_df = spark.sql("""
    SELECT
        Year,
        Month,
        DayOfMonth,
        UniqueCarrier,
        Origin,
        Dest,
        ArrDelay
    FROM flights
    WHERE ArrDelay > 15
""")

# COMMAND ----------

# Step 7: Show the results of the transformation
transformed_df.show(5)

# COMMAND ----------

# Step 8: Save the transformed data
# Saving the transformed data to a Parquet file
transformed_df.write.mode("overwrite").parquet("/tmp/transformed_flights")

# COMMAND ----------

# MAGIC %md
# MAGIC ## Conclusion
# MAGIC This notebook demonstrated how to load, transform, and save data using both PySpark and Spark SQL in Databricks. You can adapt this example to fit your specific ETL requirements.
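Not part of the committed notebook, but a natural follow-up to Step 8: a minimal verification sketch that reads the Parquet output back and inspects it, assuming the same Spark session and the /tmp/transformed_flights path used above. The verified_df name and the per-carrier aggregation are illustrative, not from the commit.

# COMMAND ----------

# Hypothetical verification cell (not in the original commit):
# read the Parquet output from Step 8 back and inspect it
from pyspark.sql.functions import avg

verified_df = spark.read.parquet("/tmp/transformed_flights")

# Confirm the schema and row count survived the round trip
verified_df.printSchema()
print(verified_df.count())

# Average arrival delay per carrier among the delayed flights kept above
verified_df.groupBy("UniqueCarrier").agg(avg("ArrDelay").alias("avg_arr_delay")).show(5)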
@@ -0,0 +1,11 @@
# Get all network adapters
$adapters = Get-NetAdapter

# Filter out virtual and disabled adapters
# (Get-NetAdapter output has no 'Physical' property; test the 'Virtual' flag instead)
$activeAdapters = $adapters | Where-Object { -not $_.Virtual -and $_.Status -eq 'Up' }

# Select relevant properties and format the output
$report = $activeAdapters | Select-Object Name, InterfaceDescription, Status, MacAddress, LinkSpeed | Format-Table -AutoSize | Out-String

# Write the output to a file
$report | Out-File -FilePath "NetworkAdaptersReport.txt"
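A usage note, as context rather than part of the commit: Get-NetAdapter ships with the NetAdapter module on Windows 8/Server 2012 and later, and reading adapter state typically needs no elevation. Because -FilePath is relative, NetworkAdaptersReport.txt lands in the current working directory of the PowerShell session that runs the script.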
Binary file not shown.
Binary file not shown.