Dllama

A simple and easy to use library for doing local LLM inference directly from your favorite programming language.

What is Dllama?

A simple and easy-to-use library for doing local LLM inference directly from your favorite programming language. It loads GGUF-formatted LLMs into CPU or GPU memory and uses a Vulkan back end for acceleration.
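
Model selection and runtime options are supplied through the config.json file passed to Dllama_Init (see the Simple Example below). The exact schema is defined by Dllama and is not reproduced here; the sketch below is purely illustrative, every key name is a hypothetical placeholder, and it only suggests the kind of settings such a file would typically carry: where the GGUF model file lives, how many layers to offload to the GPU, and the context size.

JSON

{
  "model_path": "path/to/your/phi3.gguf",
  "gpu_layers": -1,
  "max_context": 1024
}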

Integrate Large Language Model (LLM) support to enhance your applications with advanced natural language processing and understanding capabilities. Users benefit from accurate, sophisticated responses to complex language tasks such as generation, question answering, sentiment analysis, and translation. Leveraging an LLM improves user engagement, satisfaction, and overall functionality with a highly intelligent and responsive user experience.

Benefits

Cost-effectiveness

Deploying a local Large Language Model (LLM) can be cost-effective: it reduces API, data transfer, and storage costs, gives you better control over resources, enhances data privacy, and provides long-term financial benefits through customization and efficiency.

Ease of use

We prioritize a user-friendly integration process, ensuring developers can easily incorporate our AI capabilities into their applications without facing unnecessary complexity or technical challenges.

Focus on supporting indie developers

Our service is specifically designed to cater to the unique needs and constraints of indie developers. We strive to empower them with access to high-quality language models while keeping costs minimal.

Affordable access to LLMs

By leveraging high-quality large language models, we provide developers with access to cutting-edge AI capabilities at a fraction of the cost of other options, allowing them to use state-of-the-art technology without incurring exorbitant expenses.

Seamless app AI integration without high expenses

Our service relieves the burden of high expenses, enabling developers to seamlessly integrate AI into their applications. With affordable access to our service, developers can focus on enhancing their app's functionality instead of worrying about costly infrastructure.

Enhancing Data Privacy

Using a local Large Language Model (LLM) enhances privacy by keeping sensitive data on-premises, reducing the risk of data breaches associated with external cloud services.

Features

Simple Example

Delphi


uses
  System.SysUtils,
  Dllama;

begin
  // init
  if not Dllama_Init('config.json', nil) then
    Exit;
  try
    // add message
    Dllama_AddMessage(ROLE_SYSTEM, 'You are a helpful AI assistant');
    Dllama_AddMessage(ROLE_USER, 'What is AI?');

    // do inference
    if Dllama_Inference('phi3', 1024, nil, nil, nil) then
    begin
      // success
    end
    else
    begin
      // error
    end;
  finally
    Dllama_Quit();
  end;
end.

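The same four calls cover the other language tasks mentioned earlier, such as sentiment analysis or translation; only the messages change. The sketch below is the Simple Example with a sentiment-classification prompt swapped in; the 'phi3' model reference, the 1024-token limit, and the nil callback parameters are carried over unchanged and assumed to behave exactly as in the example above.

Delphi

uses
  System.SysUtils,
  Dllama;

begin
  // init
  if not Dllama_Init('config.json', nil) then
    Exit;
  try
    // steer the model toward a one-word sentiment label
    Dllama_AddMessage(ROLE_SYSTEM, 'You are a sentiment classifier. Reply with Positive, Negative, or Neutral only.');
    Dllama_AddMessage(ROLE_USER, 'I waited an hour and the food arrived cold.');

    // same inference call as the Simple Example above
    if Dllama_Inference('phi3', 1024, nil, nil, nil) then
    begin
      // success
    end
    else
    begin
      // error
    end;
  finally
    Dllama_Quit();
  end;
end.
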
Media

Inference

License

License: BSD 3-Clause License

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this
   list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice,
   this list of conditions and the following disclaimer in the documentation
   and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its
   contributors may be used to endorse or promote products derived from
   this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

If this project is useful to you, consider starring the repo, sponsoring it, spreading the word, etc. Any help is greatly welcomed and appreciated.

Contact

We hope you find this product useful.
Feel free to get in touch if you have any questions or suggestions.

Need help using Dllama? Reach out to us via the links below:

Issues
Discussions
Discord
tinyBigGAMES™ LLC

Copyright © 2024-present by tinyBigGAMES™ LLC, All Rights Reserved.