MAC dependency is too large to deploy to CloudHub #112
Hi @art-mep, we are aware of the issue.
Hi @tbolis-at-mulesoft
What have you added to your app? Would it be an option to split the app into two: one that only wraps the connector and exposes the methods you need, and another that consumes the first and adds the additional dependencies?
About 5-6 flows of simple logic that use SMTP, GSheets, and HTTP, so there are no dependencies to get rid of, and there are no MUnit tests either. Your suggestion of a separate app that wraps only the MAC connector looks workable, even though that app can't be deployed to plain CloudHub. I'll try it with CloudHub 2.0. Thanks for your help!
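For context, the split described above would leave the MAC dependency in a separate "wrapper" application that exposes, for example, an HTTP endpoint, while the main application stays small and simply calls it over HTTP. The sketch below shows only the consuming side; the config name, host, port, and path are hypothetical placeholders, not anything defined by the connector.

```xml
<!-- Minimal sketch of the consuming (lightweight) app: it has no MAC dependency
     and delegates AI calls to the wrapper app over HTTP.
     Host, port, and path are hypothetical placeholders. -->
<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:http="http://www.mulesoft.org/schema/mule/http"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="
        http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
        http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd">

  <http:request-config name="MAC_Wrapper_Config">
    <http:request-connection host="mac-wrapper.example.internal" port="8081"/>
  </http:request-config>

  <!-- Invoked via flow-ref from the existing flows; forwards the prompt to the wrapper app -->
  <flow name="ask-gpt-via-wrapper">
    <http:request config-ref="MAC_Wrapper_Config" method="POST" path="/chat">
      <http:body><![CDATA[#[output application/json --- { prompt: payload.prompt }]]]></http:body>
    </http:request>
  </flow>
</mule>
```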
FYI, the MAC Vectors connector, for example, is currently close to 330 MB, and with only the AISearch, Azure Blob Storage, and AzureOpenAI dependencies we are close to 230 MB. Even after an eventual refactoring of mule-ai-chain-connector, I don't think we will be able to support CloudHub 2.0.
Of the MAC operations, we only use calls to OpenAI's GPT models. A way to include only the MAC parts that are actually needed would be great.
The screenshot below shows the size of a project with a few flows and only the needed dependencies, exported as a deployable JAR. On the left, the MAC connector is added to the pom file and the final size exceeds 351 MB; on the right, the same project without the MAC connector is almost 67 MB. So the MAC dependency weighs around 290 MB. That raises the question: how is it possible to deploy projects with the MAC dependency to CloudHub (with an application size limit of 200 MB) or CloudHub 2.0 (with a limit of 350 MB)?
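For reference, the MAC connector is pulled in through a single dependency block in the project's pom.xml, roughly like the one below; removing that block is what brings the build down to ~67 MB. The groupId and version shown are placeholders rather than the connector's actual published coordinates.

```xml
<!-- Hypothetical coordinates for illustration only; use the real
     groupId/artifactId/version from the connector's documentation.
     Mule 4 connectors are declared with the mule-plugin classifier, and the
     exported deployable JAR bundles the connector plus its transitive
     dependencies, which is where the extra ~290 MB comes from. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>mule-ai-chain-connector</artifactId>
  <version>1.0.0</version> <!-- placeholder version -->
  <classifier>mule-plugin</classifier>
</dependency>
```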