
Thread: Is there a wrapper (or bindings) for llama.cpp in thinBasic? (module suggestion)

  1. #1

    Is there a wrapper (or bindings) for llama.cpp in thinBasic? (module suggestion)

    Hi. I'm searching for bindings/a wrapper for llama.cpp written in any dialect of Basic or a similarly easy-to-learn language. Does anybody know of something like that? If there is no such module for thinBasic, why not create one? This is a popular AI technology that could promote thinBasic. Thanks for your attention.

  2. #2
    thinBasic author ErosOlmi
    Join Date: Sep 2004
    Location: Milan - Italy

    Are you referring to this:

    Very interesting, and a lot to study.

    Personally, I'm just a standard OpenAI user, using it to study AI possibilities and get some ideas.

    At work we used some AI features from Microsoft Azure Cognitive Services to automate some boring processes.
    We used Sentiment Analysis and Text Classification to verify and classify user feedback on product purchases, integrating our e-commerce web site (hosted on AWS) with the company's backend databases, using Azure Functions as the glue between the two worlds.
    And it is working pretty well ... so far, more than 200k customer reviews have been analyzed and integrated.

    We'll see.
    I've downloaded and installed LM Studio to get some practice in my spare time.

    Any suggestions on how to proceed?

    Eros
    Windows 10 Pro for Workstations 64bit - 32 GB - Intel(R) Xeon(R) W-10855M CPU @ 2.80GHz - NVIDIA Quadro RTX 3000

  3. #3
    Quote Originally Posted by ErosOlmi View Post

    Any suggestions on how to proceed?

    As far as I know, Pascal is easier to understand than C++, so I would recommend studying the Pascal wrapper for llama.cpp. This file is a Pascal port of the C++ header: And this is the code of a simple console app that can chat offline with AI models in .gguf format: It uses the llama.pas bindings to call functions from llama.dll.

    llama.dll can be obtained from the Neurochat installer: After installing Neurochat, open its folder and you will find llama.dll there. Then copy llama.dll into the folder where you have already put llama.pas, llama_cli.lpr and llama_cli.lpi (all these files can be found in this repo: After that, download and install the free Lazarus IDE: Then open llama_cli.lpr in Lazarus and compile it. Now you will have llama_cli.exe, which uses llama.dll to chat with AI model files.

    An example of a local AI model: To run it with llama_cli.exe you will need more than 8 GB of RAM. If you have less RAM, try this version: Other AI models can be found here:

    If you don't like Pascal, you can study other bindings/wrappers: C#/.NET -, JavaScript/Wasm (works in the browser) -

    You can also study a one-file C implementation of an offline AI chatbot: This project also has bindings/wrappers: C# -, JavaScript - Other bindings:

    Recently I discovered a Harbour wrapper for llama.cpp: The code of its minimal console AI chatbot looks rather simple: A little bit more complicated than the code of an average Basic dialect, but still understandable. The problem is that this console chatbot calls functions from hllama.cpp, so the Harbour wrapper itself is written in C++, and it's quite hard to study if you are a newbie at coding.

    However, I compiled llama.lib from the Harbour repo and now have some questions for you and the other experienced coders on this forum. Can thinBasic call functions from .lib files compiled with MSVC? Or does thinBasic have its own format of .lib files? If so, is there any way to convert an MSVC .lib to a thinBasic .lib? Can an MSVC .lib be converted to a .dll file? If yes, then the AI chat functions could be called from that .dll and so would become accessible from thinBasic code (using some declare/link command), right?

    I have also compiled llama.obj, hllama.obj and other .obj files. And here come the same questions: Can thinBasic use MSVC .obj files? Do they need to be converted to a native thinBasic format? Is there a way to make a .dll from .obj files? I can share llama.lib, hllama.obj and the other files, so just notify me if you need them.
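    By the way, the general picture as I understand it: static .lib and .obj files can only be consumed by a linker at build time, while a .dll exposes named exports that any language with a foreign-function interface (like thinBasic's declare mechanism) can call at runtime. Here is a minimal C sketch of that runtime side, loading a shared library and resolving a symbol by name. It is shown with POSIX dlopen/dlsym (on Windows the equivalents are LoadLibrary/GetProcAddress), and the library libm.so.6 with its cos function is just a stand-in for llama.dll and its exported functions:

    ```c
    #include <stdio.h>
    #include <dlfcn.h>  /* POSIX; on Windows use <windows.h> + LoadLibrary/GetProcAddress */

    int main(void)
    {
        /* Open a shared library at runtime (stand-in for llama.dll). */
        void *lib = dlopen("libm.so.6", RTLD_LAZY);
        if (!lib) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }

        /* Resolve an exported function by name (stand-in for a llama.dll export). */
        double (*cosine)(double) = (double (*)(double))dlsym(lib, "cos");
        if (!cosine) {
            fprintf(stderr, "dlsym failed: %s\n", dlerror());
            dlclose(lib);
            return 1;
        }

        /* Call through the resolved pointer. */
        printf("cos(0) = %.1f\n", cosine(0.0));
        dlclose(lib);
        return 0;
    }
    ```

    Compile with something like `cc demo.c -ldl`. So if the .lib can be rebuilt as a .dll (MSVC's linker can produce one from the .obj files with the exports marked), any language that can load DLLs should be able to reach the functions.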

    One last thing. I talked to the coder Cory Smith (founder of He converted the code of run.c (from the llama2.c project mentioned earlier) to VB.NET. I'm not a fan of VB.NET; I prefer simpler and more lightweight Basic dialects. But this converted code may be useful for you:
    Last edited by JohnClaw; 18-04-2024 at 16:27.

  4. #4
    I possibly found a simplified version of llama.cpp:

    Minimal code to chat with LLMs:


    var
      LResponse: string;
      LTokenInputSpeed: Single;
      LTokenOutputSpeed: Single;
      LInputTokens: Integer;
      LOutputTokens: Integer;
      LTotalTokens: Integer;
    begin
      // init config
      Dllama_InitConfig('C:\LLM\gguf', -1, False, VK_ESCAPE);

      // add model
      Dllama_AddModel('Meta-Llama-3-8B-Instruct-Q6_K', 'llama3', 1024*8,
        '<|start_header_id|>%s %s<|end_header_id|>',
        '\n assistant:\n', ['<|eot_id|>', 'assistant']);

      // add messages
      Dllama_AddMessage(ROLE_SYSTEM, 'you are Dllama, a helpful AI assistant.');
      Dllama_AddMessage(ROLE_USER, 'who are you?');

      // display the user prompt
      Dllama_Console_PrintLn(Dllama_GetLastUserMessage(), [], DARKGREEN);

      // do inference
      if Dllama_Inference('llama3', LResponse) then
      begin
        // display token usage
        Dllama_Console_PrintLn(CRLF, [], WHITE);
        Dllama_GetInferenceUsage(@LTokenInputSpeed, @LTokenOutputSpeed, @LInputTokens,
          @LOutputTokens, @LTotalTokens);
        Dllama_Console_PrintLn('Tokens :: Input: %d, Output: %d, Total: %d, Speed: %3.1f t/s',
          [LInputTokens, LOutputTokens, LTotalTokens, LTokenOutputSpeed], BRIGHTYELLOW);
      end
      else
        // display error
        Dllama_Console_PrintLn('Error: %s', [Dllama_GetError()], RED);
    end;

    I talked to the author of Dllama. He will help me create bindings for BCX, or may even make them himself. If you are interested in creating bindings for thinBasic, join the Dllama Discord:

