Expected Behavior
It should be possible to query the registry and read feature metadata using get_feature_view etc. in an environment that doesn't necessarily have the offline data source compute set up (i.e. PySpark installed), for example in an application that only retrieves features from the online store.
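For example, a metadata/online-only workflow like the following should not require pyspark (a minimal sketch; only the feature view name comes from the traceback below, the feature reference and entity key are illustrative):

```python
from feast import FeatureStore

# Online-serving environment: feast and the online store client are installed,
# but not pyspark or any other offline compute dependency.
fs = FeatureStore(repo_path=".")

# Reading metadata from the registry should work without pyspark...
fv = fs.get_feature_view("frank_cf_versioned")
print([f.name for f in fv.features])

# ...because this application only ever reads from the online store.
online = fs.get_online_features(
    features=["frank_cf_versioned:embedding"],  # illustrative feature reference
    entity_rows=[{"customer_id": 123}],         # illustrative entity key
).to_dict()
```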
Current Behavior
fs.get_feature_view("frank_cf_versioned")
Traceback (most recent call last):
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/importer.py", line 26, in import_class
module = importlib.import_module(module_name)
File "/Users/daniel.bunting/.pyenv/versions/3.10.13/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/infra/offline_stores/contrib/spark_offline_store/spark_source.py", line 7, in <module>
from pyspark.sql import SparkSession
File "/Applications/PyCharm.app/Contents/plugins/python/helpers/pydev/_pydev_bundle/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, *args, **kwargs)
ModuleNotFoundError: No module named 'pyspark'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Applications/PyCharm.app/Contents/plugins/python/helpers/pydev/pydevconsole.py", line 364, in runcode
coro = func()
File "<input>", line 1, in <module>
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/usage.py", line 299, in wrapper
raise exc.with_traceback(traceback)
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/usage.py", line 288, in wrapper
return func(*args, **kwargs)
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/feature_store.py", line 405, in get_feature_view
return self._get_feature_view(name, allow_registry_cache=allow_registry_cache)
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/feature_store.py", line 413, in _get_feature_view
feature_view = self._registry.get_feature_view(
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/infra/registry/registry.py", line 551, in get_feature_view
return proto_registry_utils.get_feature_view(registry_proto, name, project)
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/infra/registry/proto_registry_utils.py", line 65, in get_feature_view
return FeatureView.from_proto(feature_view_proto)
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/typeguard/__init__.py", line 1033, in wrapper
retval = func(*args, **kwargs)
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/feature_view.py", line 380, in from_proto
batch_source = DataSource.from_proto(feature_view_proto.spec.batch_source)
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/typeguard/__init__.py", line 1033, in wrapper
retval = func(*args, **kwargs)
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/data_source.py", line 300, in from_proto
cls = get_data_source_class_from_type(_DATA_SOURCE_OPTIONS[data_source_type])
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/repo_config.py", line 487, in get_data_source_class_from_type
return import_class(module_name, config_class_name, "DataSource")
File "/Users/daniel.bunting/Library/Caches/pypoetry/virtualenvs/asosrecommendations-tQUUjZTb-py3.10/lib/python3.10/site-packages/feast/importer.py", line 31, in import_class
raise FeastModuleImportError(module_name, class_name) from e
feast.errors.FeastModuleImportError: Could not import module 'feast.infra.offline_stores.contrib.spark_offline_store.spark_source' while attempting to load class 'SparkSource'
Steps to reproduce
Create a FeatureView with a SparkSource offline data source, then call the get_feature_view method from an environment without PySpark installed; see the sketch below.
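A minimal sketch of such a setup (assuming Feast's current FeatureView/SparkSource APIs; all names, tables, and fields are illustrative except the feature view name from the traceback):

```python
from datetime import timedelta

from feast import Entity, FeatureStore, FeatureView, Field
from feast.infra.offline_stores.contrib.spark_offline_store.spark_source import (
    SparkSource,
)
from feast.types import Float32

# 1) In an environment that *does* have pyspark: define and apply the objects.
customer = Entity(name="customer", join_keys=["customer_id"])

source = SparkSource(
    name="frank_cf_source",            # illustrative name
    table="recs.frank_cf",             # illustrative table
    timestamp_field="event_timestamp",
)

fv = FeatureView(
    name="frank_cf_versioned",
    entities=[customer],
    ttl=timedelta(days=1),
    schema=[Field(name="embedding", dtype=Float32)],
    source=source,
)

FeatureStore(repo_path=".").apply([customer, fv])

# 2) In an environment *without* pyspark (e.g. an online-serving app):
fs = FeatureStore(repo_path=".")
fs.get_feature_view("frank_cf_versioned")  # raises FeastModuleImportError
```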
Specifications
Version:
Platform:
Subsystem:
Possible Solution
Other offline data sources defer importing data-source-specific packages until they are required.
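For illustration, a sketch of that lazy-import pattern applied to SparkSource (not the actual implementation; the class and method shown here are heavily simplified):

```python
# spark_source.py (sketch): no module-level `from pyspark.sql import SparkSession`,
# so the registry can deserialize a SparkSource proto without pyspark installed.
from feast.data_source import DataSource


class SparkSource(DataSource):
    ...

    def get_table_column_names_and_types(self, config):
        # pyspark is only needed when the offline store actually touches data,
        # so import it lazily here instead of at the top of the file.
        from pyspark.sql import SparkSession

        spark_session = SparkSession.getActiveSession()
        if spark_session is None:
            raise RuntimeError("Could not find an active SparkSession.")
        ...
```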
Hi @dnlbunting, the log says that you do not have the pyspark library installed, which is why you cannot get features out of the feature view, so please check your venv again.
@ElliotNguyen68 What the author means is that get_data_source shouldn't fail even if pyspark is not installed. This should just be a matter of moving the pyspark import into the method where it's actually used instead of doing it at the top of the file. Looks like he already fixed this in #3873, but it got closed because of a DCO check.