
Simplifying the discovery and usage of machine-learning ready datasets in materials science and chemistry


Foundry-ML simplifies the discovery and usage of ML-ready datasets in materials science and chemistry, providing a simple API to access even complex datasets.

  • Load ML-ready data with just a few lines of code
  • Work with datasets in local or cloud environments
  • Publish your own datasets with Foundry to promote community usage
  • (in progress) Run published ML models without hassle

Learn more and see our available datasets on Foundry-ML.org

Documentation

Information on how to install and use Foundry is available in our documentation here.

DLHub documentation for model publication and running information can be found here.

Quick Start

Install Foundry-ML via command line with: pip install foundry_ml

You can use the following code to import and instantiate Foundry-ML, then load a dataset.

from foundry import Foundry

# Connect to the Materials Data Facility (MDF) index
f = Foundry(index="mdf")

# Load a dataset by its DOI, transferring the data with Globus
f = f.load("10.18126/e73h-3w6n", globus=True)

NOTE: If you are running locally and don't want to install the Globus Connect Personal endpoint, just set globus=False.
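
For example, a minimal sketch of the same call without Globus (the data is then downloaded directly rather than transferred via a Globus endpoint):

# Same dataset load as above, but without a Globus Connect Personal endpoint
f = f.load("10.18126/e73h-3w6n", globus=False)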

If running this code in a notebook, a table of metadata for the dataset will appear:

[Screenshot: table of dataset metadata]

We can retrieve the data with f.load_data(), which returns splits such as train for the different segments of the dataset, and then use matplotlib to visualize it.

import matplotlib.pyplot as plt

res = f.load_data()

# Unpack the training split: input images, per-image metadata, and target coordinates
imgs = res['train']['input']['imgs']
desc = res['train']['input']['metadata']
coords = res['train']['target']['coords']

# Plot a few images with their target coordinates overlaid
n_images = 3
offset = 150
key_list = list(imgs.keys())[offset:offset + n_images]

fig, axs = plt.subplots(1, n_images, figsize=(20, 20))
for i in range(n_images):
    axs[i].imshow(imgs[key_list[i]])
    axs[i].scatter(coords[key_list[i]][:, 0], coords[key_list[i]][:, 1], s=20, c='r', alpha=0.5)

[Screenshot: resulting plots of sample images with coordinates overlaid]

See full examples

How to Cite

If you find Foundry-ML useful, please cite the following paper:

@article{Schmidt2024,
  doi = {10.21105/joss.05467},
  url = {https://doi.org/10.21105/joss.05467},
  year = {2024},
  publisher = {The Open Journal},
  volume = {9},
  number = {93},
  pages = {5467},
  author = {Kj Schmidt and Aristana Scourtas and Logan Ward and Steve Wangen and Marcus Schwarting and Isaac Darling and Ethan Truelove and Aadit Ambadkar and Ribhav Bose and Zoa Katok and Jingrui Wei and Xiangguo Li and Ryan Jacobs and Lane Schultz and Doyeon Kim and Michael Ferris and Paul M. Voyles and Dane Morgan and Ian Foster and Ben Blaiszik},
  title = {Foundry-ML - Software and Services to Simplify Access to Machine Learning Datasets in Materials Science},
  journal = {Journal of Open Source Software}
}

Contributing

Foundry is an Open Source project and we encourage contributions from the community. To contribute, please fork the repository from the main branch and open a Pull Request against main. A member of our team will review your PR shortly.

Developer notes

To enforce consistency with the external schemas for the metadata and DataCite structures (contained in the MDF data schema repository), the dc_model.py and project_model.py pydantic data models (found in the foundry/jsonschema_models folder) were generated using the datamodel-code-generator tool. To keep the generated code compliant with flake8 linting, the --use-annotated flag was passed so that regex patterns in dc_model.py are specified with pydantic's Annotated type rather than the soon-to-be-deprecated constr type. The command used to run datamodel-code-generator looks like:

datamodel-codegen --input dc.json --output dc_model.py --use-annotated
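
For illustration, a minimal sketch of the style this produces (the model and field names here are hypothetical and assume pydantic v2; the real definitions live in dc_model.py):

from typing import Annotated
from pydantic import BaseModel, Field

class Identifier(BaseModel):
    # With --use-annotated, regex constraints are emitted via Annotated + Field
    identifierType: Annotated[str, Field(pattern=r'^DOI$')]
    # Without the flag, the generator falls back to the older constr form,
    # e.g. identifierType: constr(pattern=r'^DOI$')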

Primary Support

This work was supported by the National Science Foundation under NSF Award Number: 1931306 "Collaborative Research: Framework: Machine Learning Materials Innovation Infrastructure".

Other Support

Foundry-ML brings together many components in the materials data ecosystem, including MAST-ML, the Data and Learning Hub for Science (DLHub), and the Materials Data Facility (MDF).

MAST-ML

This work was supported by the National Science Foundation (NSF) SI2 award No. 1148011 and DMREF award number DMR-1332851.

The Data and Learning Hub for Science (DLHub)

This material is based upon work supported by Laboratory Directed Research and Development (LDRD) funding from Argonne National Laboratory, provided by the Director, Office of Science, of the U.S. Department of Energy under Contract No. DE-AC02-06CH11357. https://www.dlhub.org

The Materials Data Facility

This work was performed under financial assistance awards 70NANB14H012 and 70NANB19H005 from the U.S. Department of Commerce, National Institute of Standards and Technology, as part of the Center for Hierarchical Materials Design (CHiMaD). This work was also supported by the National Science Foundation as part of the Midwest Big Data Hub under NSF Award Number: 1636950 "BD Spokes: SPOKE: MIDWEST: Collaborative: Integrative Materials Design (IMaD): Leverage, Innovate, and Disseminate". https://www.materialsdatafacility.org