Lab for LLM fundamentals with Langchain
In [1]:
import json
In [2]:
from langchain_ollama.llms import OllamaLLM
ollamaModel = "llama3.2:latest"
model = OllamaLLM(model=ollamaModel)
In [3]:
return_value = model.invoke("Assalamualaikum")
print(return_value)
Wa alaykums salam. How can I assist you today?
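Note that OllamaLLM is the plain text-completion interface, so invoke should return an ordinary str rather than a message object. A quick way to confirm:

print(type(return_value))
# expected: <class 'str'>

This contrasts with the chat model in the next section, which returns an AIMessage.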
Conversation using a chat model
In [4]:
from langchain_core.messages import HumanMessage, AIMessage
from langchain_ollama.chat_models import ChatOllama
chat_model = ChatOllama(model=ollamaModel)
In [5]:
human_message = HumanMessage("Nama saya farras")
return_value_chage_model = chat_model.invoke([human_message])
print(type(return_value_chage_model))
print(json.dumps(return_value_chage_model.model_dump(), indent=4))
<class 'langchain_core.messages.ai.AIMessage'>
{
    "content": "Selamat pagi/siangat, Farras! Bagaimana kabar Anda hari ini?",
    "additional_kwargs": {},
    "response_metadata": {
        "model": "llama3.2:latest",
        "created_at": "2025-09-06T09:45:19.8095877Z",
        "done": true,
        "done_reason": "stop",
        "total_duration": 456539500,
        "load_duration": 153339100,
        "prompt_eval_count": 30,
        "prompt_eval_duration": 16168200,
        "eval_count": 21,
        "eval_duration": 284049200,
        "model_name": "llama3.2:latest"
    },
    "type": "ai",
    "name": null,
    "id": "run--af8e3ec2-f47c-444d-91a3-aede787d86c8-0",
    "example": false,
    "tool_calls": [],
    "invalid_tool_calls": [],
    "usage_metadata": {
        "input_tokens": 30,
        "output_tokens": 21,
        "total_tokens": 51
    }
}
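AIMessage is imported above but not used directly yet. A chat model only becomes conversational when earlier turns are passed back in on the next call; a minimal sketch of a follow-up turn, reusing the objects from the previous cell (the names history and follow_up are illustrative):

history = [
    human_message,               # "Nama saya farras"
    return_value_chage_model,    # the AIMessage returned above
    HumanMessage("Siapa nama saya?"),
]
follow_up = chat_model.invoke(history)
print(follow_up.content)

Because ChatOllama accepts the whole message list, the model can see its own previous answer and keep the conversation coherent. The wrapper class in the next section automates exactly this bookkeeping.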
Build a simple class to use chat models
In [6]:
from ChatModelsSimplifier import MyLLMChatModel
my_simple_model = MyLLMChatModel("llama3.2:latest")
In [7]:
my_simple_model.chat_models("Halo nama saya muhammad farras ma'ruf")
Senang bertemu dengan Muhammad Farras Ma'Ruf XXX
In [8]:
my_simple_model.chat_models("Saya punya tiga anak, 2 laki-laku, dan satu perempuan")
Jika Anda memiliki 3 anak, maka keluarga Anda pasti sangat bahagia dengan anggota baru yang sedang berusia XXX
In [9]:
my_simple_model.chat_models("Siapa nama saya ?")
Anda bernama Muhammad Farras Ma'Ruf XXX
In [10]:
print(my_simple_model.list_of_chats)
[SystemMessage(content='You are assitant named NoSys, you always answert the question with sufix three XXX.', additional_kwargs={}, response_metadata={}), HumanMessage(content="Halo nama saya muhammad farras ma'ruf", additional_kwargs={}, response_metadata={}), AIMessage(content="Senang bertemu dengan Muhammad Farras Ma'Ruf XXX", additional_kwargs={}, response_metadata={'model': 'llama3.2:latest', 'created_at': '2025-09-06T09:45:20.6688002Z', 'done': True, 'done_reason': 'stop', 'total_duration': 414237400, 'load_duration': 188611600, 'prompt_eval_count': 58, 'prompt_eval_duration': 19896400, 'eval_count': 14, 'eval_duration': 204529200, 'model_name': 'llama3.2:latest'}, id='run--d5df3106-3a88-4f63-b3a9-ebda48431225-0', usage_metadata={'input_tokens': 58, 'output_tokens': 14, 'total_tokens': 72}), HumanMessage(content='Saya punya tiga anak, 2 laki-laku, dan satu perempuan', additional_kwargs={}, response_metadata={}), AIMessage(content='Jika Anda memiliki 3 anak, maka keluarga Anda pasti sangat bahagia dengan anggota baru yang sedang berusia XXX', additional_kwargs={}, response_metadata={'model': 'llama3.2:latest', 'created_at': '2025-09-06T09:45:21.2900196Z', 'done': True, 'done_reason': 'stop', 'total_duration': 610637200, 'load_duration': 156224700, 'prompt_eval_count': 100, 'prompt_eval_duration': 8874500, 'eval_count': 31, 'eval_duration': 443099000, 'model_name': 'llama3.2:latest'}, id='run--2a5e83fb-815d-4f5c-9204-dcf27af7a8a7-0', usage_metadata={'input_tokens': 100, 'output_tokens': 31, 'total_tokens': 131}), HumanMessage(content='Siapa nama saya ?', additional_kwargs={}, response_metadata={}), AIMessage(content="Anda bernama Muhammad Farras Ma'Ruf XXX", additional_kwargs={}, response_metadata={'model': 'llama3.2:latest', 'created_at': '2025-09-06T09:45:21.6142554Z', 'done': True, 'done_reason': 'stop', 'total_duration': 317733900, 'load_duration': 141230500, 'prompt_eval_count': 145, 'prompt_eval_duration': 9308300, 'eval_count': 12, 'eval_duration': 165639200, 'model_name': 'llama3.2:latest'}, id='run--e97b0621-bbd0-41f9-8b64-7c74e1a0d37f-0', usage_metadata={'input_tokens': 145, 'output_tokens': 12, 'total_tokens': 157})]
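The ChatModelsSimplifier module itself is not shown in this lab. Based on the behaviour above (a seeded SystemMessage, a chat_models method, and a growing list_of_chats), a minimal sketch of what MyLLMChatModel could look like; this is an assumption about the module, not its actual source, and the system-prompt wording is copied verbatim from the list_of_chats output above:

# ChatModelsSimplifier.py (sketch)
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
from langchain_ollama.chat_models import ChatOllama


class MyLLMChatModel:
    """Thin wrapper that keeps the whole message history between calls."""

    def __init__(self, model_name: str):
        self.chat_model = ChatOllama(model=model_name)
        # Seed the history with the system prompt seen in list_of_chats above.
        self.list_of_chats = [
            SystemMessage(
                "You are assitant named NoSys, you always answert the question "
                "with sufix three XXX."
            )
        ]

    def chat_models(self, text: str) -> str:
        # Append the user's turn, send the full history, and remember the reply
        # so later questions (e.g. "Siapa nama saya ?") can be answered from context.
        self.list_of_chats.append(HumanMessage(text))
        ai_message: AIMessage = self.chat_model.invoke(self.list_of_chats)
        self.list_of_chats.append(ai_message)
        print(ai_message.content)
        return ai_message.content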
Creating a reusable LLM
In [ ]: