Bug description

I attempted to run the advertised demo on the Modular site for the Mistral pipeline, but I receive the following output stating the mistral command does not exist:

I have a potential fix in a fork, but I don't believe I have it configured correctly: I get a bf16 data-type error on my MacBook Pro (M3 Max). I'm fairly new to the AI world and assume some transform config is missing?

error: The bf16 data type is not supported on device 'cpu:0'.
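For reference, this is the kind of fallback I was expecting somewhere in the config path. The helper below is purely my own sketch (the function name and encoding strings are assumptions, not MAX's actual API): when the target device is a CPU, drop from bf16 to float32 rather than erroring.

```python
# Hypothetical helper -- NOT part of MAX's API. Sketches the fallback I had
# in mind: bf16 is requested, but CPU targets fall back to float32.
def pick_encoding(device: str, requested: str = "bfloat16") -> str:
    """Return a dtype the device can run, assuming CPUs lack bf16 kernels."""
    if device.startswith("cpu") and requested == "bfloat16":
        return "float32"
    return requested

print(pick_encoding("cpu:0"))  # float32
print(pick_encoding("gpu:0"))  # bfloat16
```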
@main.command(name="mistral")
@pipeline_config_options
@common_server_options
@click.option(
    "--prompt",
    type=str,
    default="Why is the sky blue?",
    help="The text prompt to use for further generation.",
)
@click.option(
    "--num-warmups",
    type=int,
    default=0,
    show_default=True,
    help="# of warmup iterations to run before the final timed run.",
)
@click.option(
    "--serve",
    type=bool,
    default=False,
    is_flag=True,
    show_default=True,
    help="Whether to serve an OpenAI HTTP endpoint on port 8000.",
)
def run_mistral(
    prompt,
    num_warmups,
    serve,
    profile_serve,
    performance_fake,
    batch_timeout,
    model_name,
    **config_kwargs,
):
    """Runs the Mistral pipeline."""
    # Update basic parameters.
    if config_kwargs["architecture"] is None:
        config_kwargs["architecture"] = "MistralForCausalLM"

    if config_kwargs["architecture"] != "MistralForCausalLM":
        msg = (
            f"provided architecture '{config_kwargs['architecture']}' not"
            " compatible with Mistral."
        )
        raise ValueError(msg)

    config_kwargs["trust_remote_code"] = True
    config = PipelineConfig(**config_kwargs)

    # if config.quantization_encoding not in [
    #     SupportedEncoding.bfloat16
    # ]:
    #     config.cache_strategy = KVCacheStrategy.NAIVE

    if serve:
        serve_pipeline(
            pipeline_config=config,
            profile=profile_serve,
            performance_fake=performance_fake,
            batch_timeout=batch_timeout,
            model_name=model_name,
        )
    else:
        generate_text_for_pipeline(
            pipeline_config=config, prompt=prompt, num_warmups=num_warmups
        )
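To illustrate why the command isn't found: with click, a subcommand only resolves once it is registered on the group. Here is a minimal self-contained sketch (toy command names, not the actual pipelines.py) showing that an unregistered subcommand is rejected exactly the way I'm seeing:

```python
# Toy click CLI (hypothetical names) demonstrating subcommand registration.
import click
from click.testing import CliRunner

@click.group()
def main():
    """Toy pipelines CLI."""

@main.command(name="mistral")
@click.option("--prompt", type=str, default="Why is the sky blue?")
def run_mistral(prompt):
    click.echo(f"prompt={prompt}")

runner = CliRunner()
ok = runner.invoke(main, ["mistral", "--prompt", "hi"])
missing = runner.invoke(main, ["llama"])  # never registered on the group

print(ok.output.strip())    # prompt=hi
print(missing.exit_code)    # non-zero: "No such command 'llama'."
```

If the decorated `run_mistral` above were absent from pipelines.py (or not attached to the group), `max` would report the mistral command as nonexistent, which matches the behavior I observe.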
Steps to reproduce
Simply follow the steps outlined here: Mistral NeMo

Review the code file for the pipeline definitions and note that the command definition is not there:
https://github.com/modular/max/blob/main/pipelines/python/pipelines.py#L141-L266

System information

- What OS did you install MAX on?
macOS Sequoia 15.2
- Provide version information for MAX by pasting the output of `max -v`
- Provide version information for Mojo by pasting the output of `mojo -v`
- Provide Magic CLI version by pasting the output of `magic -v`
magic 0.6.2 - (based on pixi 0.40.0)