
Add better local model support for LM Studio + document current workaround using `humanify openai --baseURL`

0xdevalias opened this issue · 0 comments
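A sketch of the workaround named in the title: point humanify's `openai` backend at LM Studio's local OpenAI-compatible server. `http://localhost:1234/v1` is LM Studio's documented default endpoint; the `--apiKey` value is arbitrary (LM Studio ignores it, but humanify may require the flag), and the file name is a placeholder.

```shell
# Workaround sketch, not an official recipe:
# humanify's openai backend + LM Studio's OpenAI compatibility mode.
humanify openai \
  --baseURL "http://localhost:1234/v1" \
  --apiKey "lm-studio" \
  obfuscated.js
```

This only exercises the OpenAI compatibility layer; none of LM Studio's extra capabilities (model load/unload, load parameters, enhanced stats) are reachable this way, which is the gap this issue is about.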

  • https://lmstudio.ai/
    • Your local AI toolkit.

    • Download and run Llama, DeepSeek, Mistral, Phi on your computer.

    • https://lmstudio.ai/models
      • Model Catalog

    • https://lmstudio.ai/docs/app
      • Docs

      • https://lmstudio.ai/docs/app/api
        • LM Studio as a Local LLM API Server

          You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.

          LM Studio's APIs can be used through an OpenAI compatibility mode, enhanced REST API, or through a client library like lmstudio-js.

        • https://lmstudio.ai/docs/app/api/endpoints/openai
          • OpenAI Compatibility API

        • https://lmstudio.ai/docs/app/api/endpoints/rest
          • LM Studio REST API (beta)

          • LM Studio now has its own REST API, in addition to OpenAI compatibility mode.

            The REST API includes enhanced stats such as Token / Second and Time To First Token (TTFT), as well as rich information about models such as loaded vs unloaded, max context, quantization, and more.

        • https://lmstudio.ai/docs/app/api/structured-output
          • Structured Output

            You can enforce a particular response format from an LLM by providing a JSON schema to the /v1/chat/completions endpoint, via LM Studio's REST API (or via any OpenAI client).

      • https://lmstudio.ai/docs/typescript
        • lmstudio-js (TypeScript SDK)

          The SDK provides you a set of programmatic tools to interact with LLMs, embeddings models, and agentic flows.

          • https://github.com/lmstudio-ai/lmstudio-js
            • LM Studio TypeScript SDK

• lmstudio-ts is LM Studio's official JavaScript/TypeScript client SDK.

            • https://github.com/lmstudio-ai/lmstudio-js#why-use-lmstudio-js-over-openai-sdk
              • Why use lmstudio-js over openai sdk?

                OpenAI's SDK is designed to be used with OpenAI's proprietary models. As such, it is missing many features that are essential for using LLMs in a local environment, such as:

                • Managing loading and unloading models from memory
                • Configuring load parameters (context length, gpu offload settings, etc.)
                • Speculative decoding
                • Getting information (such as context length, model size, etc.) about a model
                • ... and more

                In addition, while the openai SDK is automatically generated, lmstudio-js is designed from the ground up to be clean and easy to use for TypeScript/JavaScript developers.
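The structured-output endpoint linked above accepts the same `response_format` shape as OpenAI's API, which is how a tool like humanify could get strict JSON back from a local model. A sketch of such a request body; the model identifier and the rename-suggestion schema are hypothetical examples, not anything humanify or LM Studio defines:

```typescript
// Sketch of a /v1/chat/completions request body for LM Studio's
// structured-output mode. Only the response_format envelope
// (type: "json_schema" with name/strict/schema) comes from the docs;
// the schema contents below are illustrative.
const body = {
  model: "llama-3.2-1b-instruct", // placeholder: whatever model is loaded
  messages: [
    { role: "user", content: "Suggest a better name for the variable `a`." },
  ],
  response_format: {
    type: "json_schema",
    json_schema: {
      name: "rename_suggestion",
      strict: true,
      schema: {
        type: "object",
        properties: { newName: { type: "string" } },
        required: ["newName"],
      },
    },
  },
};

console.log(JSON.stringify(body, null, 2));
```

POSTing this to `http://localhost:1234/v1/chat/completions` (the default server address) should constrain the model's reply to the given schema, without needing lmstudio-js at all.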

See Also

  • https://github.com/jehna/humanify/issues/400
  • https://github.com/jehna/humanify/issues/84
  • https://github.com/jehna/humanify/issues/14
  • https://github.com/jehna/humanify/issues/416

0xdevalias · Apr 26 '25