
Functions and Methods

Functions and methods play a crucial role in implementing the functionality of a traditional GenAI application. In jaclang, functions and methods are designed to be highly flexible and powerful: they don't even require a function or method body, thanks to the MTLLM by <your_llm> syntax. This section will guide you on how to effectively use functions and methods in jaclang with MTLLM.

Functions

Functions (abilities) in jaclang are defined using the can keyword and describe a set of actions. A normal function looks like this in jaclang:

can <function_name>(<parameter: parameter_type>, ...) -> <return_type> {
    <function_body>;
}

In a traditional GenAI application, you would make API calls inside the function body to perform the desired action. In jaclang, however, you can declare the function with the by <your_llm> syntax, leave out the body, and let the MTLLM model handle the implementation. Here is an example:

can greet(name: str) -> str by <your_llm>();

In the above example, the greet function takes a name parameter of type str and returns a str. The function is defined using the by <your_llm> syntax, which means the implementation of the function is handled by the MTLLM.
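Once declared, greet can be called like any other ability. Below is a minimal usage sketch; it assumes <your_llm> has been replaced with a concrete model object (here an OpenAI instance from mtllm.llms, mirroring the examples that follow), and the argument "Alice" is purely illustrative.

import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

can greet(name: str) -> str by llm();

with entry {
    # The greeting text itself is generated by the model at call time.
    print(greet("Alice"));
}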

Below is an example that defines two functions. get_expert takes a question as input and returns, as a string, the best expert to answer it, using MTLLM with the OpenAI model and the Reason method. get_answer takes a question and an expert as input and returns the answer to the question, using MTLLM with the OpenAI model and no method. Both can then be called like normal functions.

import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

can get_expert(question: str) -> 'Best Expert to Answer the Question': str by llm(method='Reason');
can get_answer(question: str, expert: str) -> str by llm();

with entry {
    question = "What are Large Language Models?";
    expert = get_expert(question);
    answer = get_answer(question, expert);
    print(f"{expert} says: '{answer}' ");
}

Here's another example:

import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

can 'Get a Joke with a Punchline'
get_joke() -> tuple[str, str] by llm();

with entry {
    (joke, punchline) = get_joke();
    print(f"{joke}: {punchline}");
}

In the above example, the get_joke function returns a tuple of two strings: the joke and its punchline. The function is defined using the by <your_llm> syntax, which means the implementation is handled by the MTLLM. You can add a semstr (semantic string) to the function to make it more specific, as the sketch below shows.
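For instance, a semstr can also be attached to the return type to constrain the output further; the wording of the semstr below is only illustrative:

can 'Get a Joke with a Punchline'
get_joke() -> 'A short, family-friendly joke and its punchline': tuple[str, str] by llm();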

Methods

Methods in jaclang are also defined using the can keyword and describe a set of actions that are specific to a class. A normal method looks like this in jaclang:

obj ClassName {
    has parameter: parameter_type;

    can <method_name>(<parameter: parameter_type>, ...) -> <return_type> {
        <method_body>;
    }
}

In a traditional GenAI application, you would make API calls inside the method body to perform the desired action, using the self keyword to access the necessary information. In jaclang, however, you can declare the method with the by <your_llm> syntax, leave out the body, and let the MTLLM model handle the implementation. Here is an example:

obj Person {
    has name: str;

    can greet() -> str by <your_llm>(incl_info=(self));
}

In the above example, the greet method returns a str. The method is defined using the by <your_llm> syntax, which means the implementation of the method is handled by the MTLLM. The incl_info=(self) parameter includes the Person object, and therefore its name attribute, as an information source for the MTLLM.
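Such a method is called like any normal method on an instance. A minimal sketch, again assuming a concrete llm object has been configured (the name "Sunny" is only illustrative):

import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

obj Person {
    has name: str;

    can greet() -> str by llm(incl_info=(self));
}

with entry {
    person = Person("Sunny");
    # incl_info=(self) lets the model see this Person's name when generating the greeting.
    print(person.greet());
}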

In the example below, we define an Essay class with three methods, all implemented by MTLLM with the OpenAI model. get_essay_judgement takes a criterion as input and returns a judgement of the essay against that criterion. get_reviewer_summary takes a dictionary of judgements as input and returns a reviewer summary based on them. give_grade takes the summary as input and returns a grade for the essay. These methods can then be called like normal methods.

import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

obj Essay {
    has essay: str;

    can get_essay_judgement(criteria: str) -> str by llm(incl_info=(self.essay));
    can get_reviewer_summary(judgements: dict) -> str by llm(incl_info=(self.essay));
    can give_grade(summary: str) -> 'A to D': str by llm();
}

with entry {
    essay = "With a population of approximately 45 million Spaniards and 3.5 million immigrants,"
        "Spain is a country of contrasts where the richness of its culture blends it up with"
        "the variety of languages and dialects used. Being one of the largest economies worldwide,"
        "and the second largest country in Europe, Spain is a very appealing destination for tourists"
        "as well as for immigrants from around the globe. Almost all Spaniards are used to speaking at"
        "least two different languages, but protecting and preserving that right has not been"
        "easy for them.Spaniards have had to struggle with war, ignorance, criticism and the governments,"
        "in order to preserve and defend what identifies them, and deal with the consequences.";
    essay = Essay(essay);
    criterias = ["Clarity", "Originality", "Evidence"];
    judgements = {};
    for criteria in criterias {
        judgement = essay.get_essay_judgement(criteria);
        judgements[criteria] = judgement;
    }
    summary = essay.get_reviewer_summary(judgements);
    grade = essay.give_grade(summary);
    print("Reviewer Notes: ", summary);
    print("Grade: ", grade);
}

Ability to Understand Typed Inputs and Outputs

MTLLM is able to represent typed inputs in a way that is understandable to the model. At the same time, this lets the model generate outputs of the expected type without any additional information. Here is an example:

import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

enum 'Personality of the Person'
Personality {
    INTROVERT: 'Person who is shy and reticent' = "Introvert",
    EXTROVERT: 'Person who is outgoing and socially confident' = "Extrovert"
}

obj 'Person'
Person {
    has full_name: 'Fullname of the Person': str,
        yod: 'Year of Death': int,
        personality: 'Personality of the Person': Personality;
}

can 'Get Person Information use common knowledge'
get_person_info(name: 'Name of the Person': str) -> 'Person': Person by llm();

with entry {
    person_obj = get_person_info('Martin Luther King Jr.');
    print(person_obj);
}

# Output
Person(full_name='Martin Luther King Jr.', yod=1968, personality=Personality.INTROVERT)

In the above example, the get_person_info function takes a name parameter of type str and returns a Person object. The Person object has three attributes: full_name of type str, yod of type int, and personality of type Personality. The Personality enum has two values: INTROVERT and EXTROVERT. The function is defined using the by llm() syntax, which means the implementation is handled by the MTLLM. The model understands the typed inputs and outputs and generates the output in the expected type, so the returned value behaves like any other Person instance (see the sketch below).
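Because the result is a typed Person object, its fields can be accessed directly in ordinary jac code. A minimal, illustrative sketch continuing the example above:

with entry {
    person_obj = get_person_info('Martin Luther King Jr.');
    # Each typed field is a regular attribute of the returned object.
    print(person_obj.full_name);
    print(person_obj.yod);
    print(person_obj.personality);
}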
