The operation failed with an error. [400] Invalid model ‘llama-3-8b-instruct’. Permitted models can be found in the documentation at Home - Perplexity.
How is that possible if the model is actually permitted?
The same thing happened with Mixtral, and now it is happening with Llama. Why?
Also, this drives me crazy: why is it that every time I open a scenario to edit it, ALL the modules get cloned and duplicated automatically, each one stacked on top of the other?
SOLUTION: In the Perplexity module settings, instead of selecting a model from the dropdown, click the "Map" toggle and paste in the model name directly, e.g. llama-3.1-sonar-large-128k-online (or whichever model you want to use). After that it seems to work fine. A quick sketch of the equivalent raw API call is below.
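For what it's worth, here is a minimal sketch of what the mapped field ends up doing: the model is just sent as a plain string in the request body, rather than being limited to whatever the module's dropdown happens to list. This assumes the standard Perplexity chat-completions endpoint and an API key in a PERPLEXITY_API_KEY environment variable; the model string is the one from the workaround above.

```python
import os
import requests

# Model string from the workaround above; swap in whichever model
# Perplexity currently lists as permitted.
MODEL = "llama-3.1-sonar-large-128k-online"

# Assumes the standard Perplexity chat-completions endpoint and an API key
# stored in the PERPLEXITY_API_KEY environment variable.
response = requests.post(
    "https://api.perplexity.ai/chat/completions",
    headers={
        "Authorization": f"Bearer {os.environ['PERPLEXITY_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        # Plain string, same as pasting the name into the mapped field.
        "model": MODEL,
        "messages": [
            {"role": "user", "content": "Hello from a mapped model name"},
        ],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

If this call returns a 400 "Invalid model" error for a name that the docs list as permitted, the name in the dropdown is probably stale or formatted differently from what the API expects, which would explain why mapping the exact string works while the dropdown entry fails.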