:alien: (Compatibility) Make model optional to allow using TogetherAI endpoint
What
TogetherAI's API mimics OpenAI's; however, the model field is not returned in chat stream responses, so it must be made optional for decoding to proceed without erroring.
Simply added a `?` to make `model` optional.
Enables you to use the TogetherAI API like so:
let configuration = OpenAI.Configuration(token: "NOTYOURTOKEN", host: "api.together.xyz", timeoutInterval: 10)
let ai = OpenAI(configuration: configuration)
Affected Areas
ChatStreamResponse.swift
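For illustration, a minimal standalone sketch of the change described above: a stream-response type whose `model` property is declared optional, so decoding succeeds when a provider such as TogetherAI omits the field. The struct name and fields here are hypothetical simplifications, not the library's actual declarations.

```swift
import Foundation

// Hypothetical, simplified stream-response type. The key point is that
// `model` is `String?` rather than `String`, so a payload without a
// "model" key decodes instead of throwing `keyNotFound`.
struct ChatStreamResponse: Decodable {
    let id: String
    let model: String?   // was non-optional; now optional for compatibility
}

// A TogetherAI-style chunk that omits "model":
let json = #"{"id": "chatcmpl-123"}"#.data(using: .utf8)!
let response = try! JSONDecoder().decode(ChatStreamResponse.self, from: json)
// response.model is nil; decoding no longer errors out
```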
Does this fall under scope creep? This project specifically implements the OpenAI API.
It's an extremely minor change, and most inference providers intentionally clone the OpenAI API in order to be compatible with projects such as this. I have submitted a ticket to them regarding this issue as well, but this solution will be more robust.
I've now added support for paths in the host config, enabling Groq API support (api.groq.com/openai).
extension OpenAI {
    func buildURL(path: String) -> URL {
        var components = URLComponents()
        components.scheme = "https"
        // Check if host contains a path component
        let hostParts = configuration.host.split(separator: "/", maxSplits: 1, omittingEmptySubsequences: true)
        if hostParts.count > 1 {
            // Host contains a path component
            components.host = String(hostParts[0])
            let hostPath = "/" + hostParts[1]
            components.path = hostPath + path
        } else {
            // Host does not contain a path component
            components.host = configuration.host
            components.path = path
        }
        guard let url = components.url else {
            fatalError("Unable to construct URL from components.")
        }
        return url
    }
}
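To show the intended behavior, here is the same host-splitting logic as a self-contained free function (rewritten from the extension above purely so it can run standalone), along with the URLs it produces for a plain host and for a Groq-style host that carries a path component:

```swift
import Foundation

// Standalone restatement of the buildURL(path:) logic above,
// taking the host as a parameter instead of reading it from
// a Configuration.
func buildURL(host: String, path: String) -> URL {
    var components = URLComponents()
    components.scheme = "https"
    // Split the host on the first "/" to detect a path component.
    let hostParts = host.split(separator: "/", maxSplits: 1, omittingEmptySubsequences: true)
    if hostParts.count > 1 {
        // Host like "api.groq.com/openai": prepend its path to the request path.
        components.host = String(hostParts[0])
        components.path = "/" + hostParts[1] + path
    } else {
        // Plain host like "api.together.xyz".
        components.host = host
        components.path = path
    }
    return components.url!
}

// buildURL(host: "api.groq.com/openai", path: "/v1/chat/completions")
// → https://api.groq.com/openai/v1/chat/completions
// buildURL(host: "api.together.xyz", path: "/v1/chat/completions")
// → https://api.together.xyz/v1/chat/completions
```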
 Quality Gate passed
Issues
 2 New issues
 0 Accepted issues
Measures
 0 Security Hotspots
 No data about Coverage
 No data about Duplication
Any update on this? I'm interested because many models can be used this way: Groq, TogetherAI, Huggingface, Azure, etc.
Yeah, it works. I use it with Groq, TogetherAI, and all the LiteLLM-supported models (Anthropic mostly). My fork will be updated to maintain that as well as OpenAI; unsure if they will merge this into master here.
 Quality Gate passed
Issues
 8 New issues
 0 Accepted issues
Measures
 0 Security Hotspots
 No data about Coverage
 No data about Duplication
branching far away from the original scope of this project now