What is your goal?
I want to connect GoHighLevel (GHL) SMS replies to OpenAI via Make so that:
1. A contact replies via SMS in GHL
2. The message is sent to OpenAI through a Make webhook
3. OpenAI generates a response
4. Make sends the response back to the same contact via the LeadConnector Conversations API
In short: build a fully automated two-way AI SMS conversation between GHL and OpenAI using Make.
What is the problem & what have you tried?
What works:
GHL successfully triggers a webhook on inbound SMS
Make receives the webhook payload
OpenAI generates a valid response (confirmed in OpenAI module output)
The OpenAI response is visible in Make execution logs
What does NOT work:
When Make sends the OpenAI response back to GHL via HTTP (POST to LeadConnector Conversations API), GHL returns: 422 Unprocessable Entity
“There is no message or attachments for this message. Skip sending.”
Error message / output bundle:
```json
{
  "status": 422,
  "message": "There is no message or attachments for this message. Skip sending.",
  "name": "HttpException"
}
```
This is the request body I am sending:

```json
{
  "type": "SMS",
  "contactId": "{{1.contact_id}}",
  "message": "{{3.output.content.text}}"
}
```
Additional context (important)
This is a real-time SMS workflow (timing-sensitive)
The issue appears related to how Make serializes the OpenAI module's output fields into the HTTP JSON string
Possibly a race condition, output truncation, or an incorrect field reference when OpenAI returns a structured response object instead of a plain string
I suspect either:
An incorrect OpenAI output field mapping, or
A Make bug combining OpenAI module output with an HTTP JSON string
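For context on the mapping suspicion: a standard Chat Completions response nests the text under `choices[0].message.content`, not under a flat `output.content.text` path. The sketch below illustrates that shape; the exact field names Make's OpenAI module exposes may differ, so the output bundle in the execution log is the authoritative reference:

```python
# The OpenAI chat response is a nested object, not a flat string.
# Field names follow the standard Chat Completions shape; Make's module
# may expose a different path, so verify against the output bundle.
def extract_reply_text(response: dict) -> str:
    """Pull the assistant text out of a Chat Completions-style response."""
    choices = response.get("choices") or []
    if not choices:
        return ""
    return choices[0].get("message", {}).get("content") or ""

sample = {
    "choices": [
        {"message": {"role": "assistant", "content": "Thanks for your reply!"}}
    ]
}
```

If the mapped path does not exist in the actual response, Make resolves it to an empty string, which would explain why GHL sees "no message" even though the OpenAI module output looks correct.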
What help is needed
Confirmation of the correct OpenAI output field to map for HTTP SMS sending
Best-practice pattern for GHL ↔ Make ↔ OpenAI two-way SMS
Whether this is a known Make + OpenAI module issue
Recommended workaround if OpenAI output must be transformed before sending to GHL
