Trying to import pyspark lazily to avoid the dependency on the library. This change will fix the issue - feast-dev#3872

Picked up changes from the PR - feast-dev#3873

Signed-off-by: Lokesh Rangineni <[email protected]>
lokeshrangineni committed Apr 10, 2024
1 parent f09c612 commit bc19cc4
Showing 1 changed file with 7 additions and 2 deletions.
@@ -4,8 +4,6 @@
 from enum import Enum
 from typing import Any, Callable, Dict, Iterable, Optional, Tuple
 
-from pyspark.sql import SparkSession
-
 from feast import flags_helper
 from feast.data_source import DataSource
 from feast.errors import DataSourceNoNameException, DataSourceNotFoundException
@@ -162,6 +160,13 @@ def get_table_column_names_and_types(
 
     def get_table_query_string(self) -> str:
         """Returns a string that can directly be used to reference this table in SQL"""
+        try:
+            from pyspark.sql import SparkSession
+        except ImportError as e:
+            from feast.errors import FeastExtrasDependencyImportError
+
+            raise FeastExtrasDependencyImportError("spark", str(e))
+
         if self.table:
             # Backticks make sure that spark sql knows this a table reference.
             table = ".".join([f"`{x}`" for x in self.table.split(".")])
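A minimal sketch (not part of the commit) of how the lazy import behaves from a caller's point of view. The module path and constructor arguments are assumptions based on recent Feast releases; the point is that pyspark is only imported when a Spark-specific method is actually called.

# Hypothetical illustration, not from the commit: module path and constructor
# arguments below are assumed, not confirmed by this diff.
from feast.infra.offline_stores.contrib.spark_offline_store.spark_source import SparkSource
from feast.errors import FeastExtrasDependencyImportError

# With this change, importing the module and constructing the source
# no longer requires pyspark to be installed.
source = SparkSource(name="driver_stats", table="db.driver_stats")

try:
    # pyspark is only imported here, inside get_table_query_string().
    print(source.get_table_query_string())
except FeastExtrasDependencyImportError as err:
    # Raised when pyspark is missing; the error should point at the
    # missing "spark" extra (e.g. pip install 'feast[spark]').
    print(err)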
