Lua Trainable RAG Local Chatbot Library, Local Model Aligns Large Model, Geometry+Worded Math Solver, Math/Geometry Data Generator, Chat Database

For comparison, this method is similar to how Google ranks search results, except it uses synonymous words and phrases to expand its understanding. After looking into neural networks, it appears that they determine their own algorithm, so coming up with a concrete human-made algorithm is not an invalid approach. The main challenge is combining knowledge and applying concepts; AI works in a very similar manner, but here the tables are modified by the human responses. Anyway, I wanted to contribute some more code to this project, so I am releasing the wisdom database I use for my bot. This will greatly improve the responses you receive from the bot. The contents of this database are educational in nature, covering the universe along with quotes from Sun Tzu's Art of War.

wisdom={" Knowledge is the key to victory."," Do not let the darkness stop you on your quest."," If you have any more questions, feel free to ask."," I've been studying this world."," I'm sure you many questions will reveal themselves to you in time."," One of the best ways to learn is to try things out yourself. Explore as much as you can! You will find many mysteries to unravel.",
						" Adapt to the situation and you will find the way to victory.", " Do not let fear hold you back from your destiny.", " If you seek more wisdom, do not hesitate to ask.", " I have learned much from this world.", " I trust that you will discover many secrets in due time.", " The best teacher is experience. Go and explore the wonders of this world! You will encounter many challenges and opportunities.", " ", " Flow like water and adjust to the terrain you face.", " Achieve success by aligning yourself with the enemy’s intentions.", " Harness your energy and release it at the right moment.", " Emotions are fleeting, but consequences are lasting. A ruined kingdom cannot be restored, nor can the dead be revived.", " If you can provoke your enemy’s anger, you can make him lose his composure. Pretend to be weak, so he will become overconfident.", " In war, deception is your ally and you will prevail.", " If the enemy exposes a weakness, you must exploit it.", " We cannot form alliances until we know the plans of our neighbors.", " The seasoned soldier is always prepared and confident; he knows what to do before and after the battle.", " If you know your enemy and yourself, your victory is certain.", " The leader who excels in defense is hidden and secure; the leader who excels in attack is swift and surprising.", " The supreme leader follows the moral law, and adheres to method and discipline; thus he has the power to control success.", " Fake disorder implies perfect order; fake fear implies courage; fake weakness implies strength.", " Whoever arrives first and waits for the enemy, will be ready for the fight; whoever arrives second and rushes to battle, will be weary and exhausted.", " The quality of decision is like the swift strike of a falcon that destroys its prey."," Subdue your enemy without fighting.", " Deceive your enemy in warfare.", " Keep your plans secret and strike like lightning.", " Adapt to the situation and you will find the way to victory.", " Do not let fear hold you back from your destiny.", " If you seek more wisdom, do not hesitate to ask.", " I have learned much from this world.", " I trust that you will discover many secrets in due time.", " The best teacher is experience. Go and explore the wonders of this world! You will encounter many challenges and opportunities.", " ", " Flow like water and adjust to the terrain you face.", " Achieve success by aligning yourself with the enemy’s intentions.", " Harness your energy and release it at the right moment.", " Emotions are fleeting, but consequences are lasting. A ruined kingdom cannot be restored, nor can the dead be revived.", " If you can provoke your enemy’s anger, you can make him lose his composure. 
Pretend to be weak, so he will become overconfident.", " In war, deception is your ally and you will prevail.", " If the enemy exposes a weakness, you must exploit it.", " We cannot form alliances until we know the plans of our neighbors.", " The seasoned soldier is always prepared and confident; he knows what to do before and after the battle.", " If you know your enemy and yourself, your victory is certain.", " The leader who excels in defense is hidden and secure; the leader who excels in attack is swift and surprising.", " The supreme leader follows the moral law, and adheres to method and discipline; thus he has the power to control success.", " Fake disorder implies perfect order; fake fear implies courage; fake weakness implies strength.", " Whoever arrives first and waits for the enemy, will be ready for the fight; whoever arrives second and rushes to battle, will be weary and exhausted.", " The quality of decision is like the swift strike of a falcon that destroys its prey."," The universe is made of matter that makes us and light that sustains us.", " Dark energy rules the universe and it is expanding every day.", " Only 5% of the universe is made of atoms.", " Nearly three fourths of the universe is dark energy.", " In the midst of chaos, there is also opportunity.", " Quasars are the brightest objects in the universe. They are powered by supermassive black holes that devour matter and emit enormous amounts of energy.", " Pulsars are rapidly spinning neutron stars that emit beams of radiation like cosmic lighthouses. They can be used as precise clocks to measure time and space.", " Magnetars are a rare type of neutron star with extremely powerful magnetic fields. They can generate intense bursts of gamma rays and x-rays that can affect the whole galaxy.", " Black holes are regions of space where gravity is so strong that nothing can escape, not even light. They can warp space and time around them and create gravitational waves.", " Planets are celestial bodies that orbit a star and have enough mass to be spherical and clear their neighborhood of other objects. There are eight planets in our solar system, each with unique features and characteristics.", " Stars are massive balls of plasma that shine by nuclear fusion in their cores. They come in different sizes, colors, temperatures, and lifetimes. Our sun is a medium-sized yellow star that has been burning for about 4.6 billion years.", " Dwarf stars are small stars that have low mass and luminosity. They can be white, red, or brown, depending on their temperature and composition. White dwarfs are the remnants of dead stars that have shed their outer layers. Red dwarfs are the most common type of star in the galaxy and can live for trillions of years. 
Brown dwarfs are failed stars that never ignited fusion in their cores.", " The force which binds us to the Earth is gravity, which is proportional to the product of our mass and the Earth’s mass, and inversely proportional to the square of the distance between our centers.", " The revolution of the Moon around the Earth is due to gravity, which provides the centripetal force that keeps the Moon in its orbit.", " The revolution of planets around the Sun is also due to gravity, which balances the tangential velocity of the planets and prevents them from flying away.", " Tide arising due to the Moon and the Sun is caused by gravity, which varies with distance and creates a difference in gravitational pull on different parts of the Earth.", " Newton’s law of universal gravitation is a general physical law derived from empirical observations by Isaac Newton. It states that every particle attracts every other particle in the universe with a force that is proportional to the product of their masses and inversely proportional to the square of the distance between their centers.", " Newton’s law of universal gravitation has been superseded by Albert Einstein’s theory of general relativity, but it still continues to be used as an excellent approximation of the effects of gravity in most applications."," The universe is made of matter that makes us and light that sustains us.",
" Dark energy rules the universe and it is expanding every day.",
" Only 5% of the universe is made of atoms." ,
" Nearly three fourths of the universe is dark energy.",
" In the midst of chaos, there is also opportunity.",
						" Quasars are the brightest objects in the universe. They are powered by supermassive black holes that devour matter and emit enormous amounts of energy.", " Pulsars are rapidly spinning neutron stars that emit beams of radiation like cosmic lighthouses. They can be used as precise clocks to measure time and space.", " Magnetars are a rare type of neutron star with extremely powerful magnetic fields. They can generate intense bursts of gamma rays and x-rays that can affect the whole galaxy.", " Black holes are regions of space where gravity is so strong that nothing can escape, not even light. They can warp space and time around them and create gravitational waves.", " Planets are celestial bodies that orbit a star and have enough mass to be spherical and clear their neighborhood of other objects. There are eight planets in our solar system, each with unique features and characteristics.", " Stars are massive balls of plasma that shine by nuclear fusion in their cores. They come in different sizes, colors, temperatures, and lifetimes. Our sun is a medium-sized yellow star that has been burning for about 4.6 billion years.", " Dwarf stars are small stars that have low mass and luminosity. They can be white, red, or brown, depending on their temperature and composition. White dwarfs are the remnants of dead stars that have shed their outer layers. Red dwarfs are the most common type of star in the galaxy and can live for trillions of years. Brown dwarfs are failed stars that never ignited fusion in their cores."," The universe is all of space and time and their contents, including planets, stars, galaxies, and all other forms of matter and energy. The Big Bang theory is the prevailing cosmological description of the development of the universe. According to this theory, space and time emerged together 13.787±0.020 billion years ago, and the universe has been expanding ever since the Big Bang."," While the spatial size of the entire universe is unknown, it is possible to measure the size of the observable universe, which is approximately 93 billion light-years in diameter at the present day."," Some of the earliest cosmological models of the universe were developed by ancient Greek and Indian philosophers and were geocentric, placing Earth at the center."," Over the centuries, more precise astronomical observations led Nicolaus Copernicus to develop the heliocentric model with the Sun at the center of the Solar System. In developing the law of universal gravitation, Isaac Newton built upon Copernicus's work as well as Johannes Kepler's laws of planetary motion and observations by Tycho Brahe."," Further observational improvements led to the realization that the Sun is one of a few hundred billion stars in the Milky Way, which is one of a few hundred billion galaxies in the observable universe. Many of the stars in a galaxy have planets. At the largest scale, galaxies are distributed uniformly and the same in all directions, meaning that the universe has neither an edge nor a center. At smaller scales, galaxies are distributed in clusters and superclusters which form immense filaments and voids in space, creating a vast foam-like structure. Discoveries in the early 20th century have suggested that the universe had a beginning and that space has been expanding since then at an increasing rate."," According to the Big Bang theory, the energy and matter initially present have become less dense as the universe expanded. 
After an initial accelerated expansion called the inflationary epoch at around 10−32 seconds, and the separation of the four known fundamental forces, the universe gradually cooled and continued to expand, allowing the first subatomic particles and simple atoms to form. Dark matter gradually gathered, forming a foam-like structure of filaments and voids under the influence of gravity. Giant clouds of hydrogen and helium were gradually drawn to the places where dark matter was most dense, forming the first galaxies, stars, and everything else seen today."," From studying the movement of galaxies, it has been discovered that the universe contains much more matter than is accounted for by visible objects; stars, galaxies, nebulas and interstellar gas. This unseen matter is known as dark matter (dark means that there is a wide range of strong indirect evidence that it exists, but we have not yet detected it directly). The ΛCDM model is the most widely accepted model of the universe. It suggests that about 69.2%±1.2% of the mass and energy in the universe is dark energy which is responsible for the acceleration of the expansion of space, and about 25.8%±1.1% is dark matter. Ordinary ('baryonic') matter is therefore only 4.84%±0.1% of the physical universe. Stars, planets, and visible gas clouds only form about 6% of the ordinary matter."," There are many competing hypotheses about the ultimate fate of the universe and about what, if anything, preceded the Big Bang, while other physicists and philosophers refuse to speculate, doubting that information about prior states will ever be accessible. Some physicists have suggested various multiverse hypotheses, in which our universe might be one among many universes that likewise exist."," The physical universe is defined as all of space and time (collectively referred to as spacetime) and their contents. Such contents comprise all of energy in its various forms, including electromagnetic radiation and matter, and therefore planets, moons, stars, galaxies, and the contents of intergalactic space.","The universe also includes the physical laws that influence energy and matter, such as conservation laws, classical mechanics, and relativity."," The universe is often defined as the totality of existence, or everything that exists, everything that has existed, and everything that will exist. In fact, some philosophers and scientists support the inclusion of ideas and abstract concepts—such as mathematics and logic—in the definition of the universe. The word universe may also refer to concepts such as the cosmos, the world, and nature."," One of the most important principles underlying quantum physics is that of wave-particle duality. Quantum objects have both particle-like properties (such as mass, charge, and energy) and wave-like properties (such as wavelength and frequency). We can see this when we observe light traveling through a prism"," Another key principle is Heisenberg’s uncertainty principle, which states that there are limits to how accurately the value of a physical quantity can be predicted prior to its measurement, given a complete set of initial conditions. For example, we cannot know both the position and momentum of an electron with absolute precision. The more we know about one, the less we know about the other"," A third principle over quantum mechanics is overlap, which means that quantum objects can exist in more than one state at the same time, until they are measured. 
For example, an electron can be in two different orbitals around an atom, or a photon can be both horizontally and vertically polarized. This is described by a mathematical entity called the wave function, which provides information about the probability of finding the object in different states.",
						" A fourth principle is quantum entanglement, which means that two or more quantum objects can share a quantum state and influence each other, even when they are far apart. For example, two photons can be entangled such that measuring the polarization of one will instantly reveal the polarization of the other, regardless of the distance between them. This phenomenon has been called spooky action at a distance by Albert Einstein.",
						" Wave-particle duality principle states that quantum objects, such as electrons and photons, have both particle-like and wave-like properties. This means that they can behave like discrete units of matter or energy, or like waves that can interfere and diffract.",
						" One way to demonstrate wave-particle duality principle is by using the double-slit experiment. In this experiment, a beam of light or a stream of electrons is directed at a screen with two narrow slits. If the light or electrons were only particles, we would expect to see two bright spots on a detector behind the screen, corresponding to the two slits. However, what we actually observe is an interference pattern, with alternating bright and dark bands. This shows that the light or electrons have passed through both slits at the same time, as waves, and interfered with each other. However, if we try to measure which slit each photon or electron goes through, by placing a detector at each slit, we find that the interference pattern disappears. Instead, we see two bright spots again, as if the light or electrons were only particles. This shows that the act of measurement affects the behavior of the quantum objects, and collapses their wave function into a definite state.",
						" Wave-particle duality principle demonstrates how quantum objects can exhibit both particle-like and wave-like properties, depending on how they are observed. This principle challenges our classical intuition and reveals the nature of reality at the smallest scales.",
								" Heisenberg’s uncertainty principle states that there are limits to how accurately the value of a physical quantity can be predicted prior to its measurement, given a complete set of initial conditions. For example, we cannot know both the position and momentum of an electron with absolute precision. The more we know about one, the less we know about the other.",
								" Heisenberg’s uncertainty principle  is not because of any limitations of our instruments or methods, but because of the intrinsic nature of quantum objects. They do not have definite values for their properties until they are measured. Instead, they exist in a superposition of possible states, each with a certain probability. The wave function describes these probabilities, and when we measure a property, we collapse the wave function into one of the possible outcomes.",
								" Heisenberg’s uncertainty principle tells us how much information we can gain from a measurement, and how much we have to lose. It is often expressed as an inequality, which relates the uncertainties in two complementary properties, such as position and momentum, or energy and time. The product of these uncertainties cannot be lower than a certain constant, called Planck’s constant.",
								" With Heisenberg’s uncertainty principle the same trade-off applies to quantum measurements. If we measure the position of an electron with high precision, we disturb its momentum and introduce uncertainty in its velocity. If we measure the momentum of an electron with high precision, we disturb its position and introduce uncertainty in its location. We cannot measure both properties with arbitrary accuracy at the same time.",
								" Quantum 3rd principle of overlap which means that quantum objects can exist in more than one state at the same time, until they are measured. For example, an electron can be in two different orbitals around an atom, or a photon can be both horizontally and vertically polarized.",
								" Quantum 3rd principle of overlap is possible because quantum objects are described by wave functions, which are mathematical expressions that represent the probability of finding the object in different states. The wave function can be a combination of two or more basis states, which are the simplest possible states that the object can have. For example, the basis states for an electron’s spin can be up or down, and the basis states for a photon’s polarization can be horizontal or vertical.",
								" Concerning the quantum 3rd principle of overlap when the wave function is a combination of two or more basis states, it is called a superposition. This means that the object has some probability of being in each of the basis states, but not in any definite state. For example, an electron can have a 50% chance of being spin up and a 50% chance of being spin down, but not a definite spin value. A photon can have a 50% chance of being horizontally polarized and a 50% chance of being vertically polarized, but not a definite polarization.",
						" Quantum 3rd principle of overlap is why when we measure a property of a quantum object, such as its spin or polarization, we collapse the wave function into one of the basis states. This means that we force the object to choose one of the possible outcomes, and we lose the information about the other outcomes. For example, if we measure the spin of an electron and find it to be up, we no longer know what its spin was before the measurement. If we measure the polarization of a photon and find it to be horizontal, we no longer know what its polarization was before the measurement.",		
						" Clouds are collections of water droplets or ice crystals that float in the air. They form when water evaporates from the ground and condenses in the cooler air.",
						" Clouds can be very heavy and move fast with the wind and the jet stream. Clouds reflect the light of the sun and make it white. Clouds can also create different types of precipitation, such as rain, snow, hail, or sleet, depending on the temperature.",
						" Clouds generally form within the troposphere, or the layer of atmosphere closest to the earth. As they rise and fall, they may appear in infinite variations. Scientists have established three broad categories into which most clouds can be grouped: high clouds, mid-level clouds, and low clouds.",
						" Within each of these three tiers, scientists further organize clouds into ten broad categories based on the general shapes the clouds take are cirrus, cirrostratus, cirrocumulus, altocumulus, altostratus, nimbostratus, stratocumulus, stratus, cumulus, and cumulonimbus.",
						" Clouds are important for many reasons. Rain and snow are two of those reasons. At night, clouds reflect heat and keep the ground warmer. During the day, clouds make shade that can keep us cooler. Studying clouds helps NASA better understand Earth’s weather.",
						" Certain types of clouds produce precipitation. Clouds also produce the bolt of electricity called lightning and the sound of thunder that accompanies it. Lightning is formed in a cloud when positively charged particles and negatively charged particles are separated, forming an electrical field.",
						" Classical mechanics is the study of the motion of macroscopic objects under the influence of forces, such as gravity, friction, and tension. It includes topics such as kinematics, dynamics, statics, work, energy, momentum, rotation, oscillation, and fluid mechanics.",
						" Thermodynamics: the study of the relationships between heat, temperature, pressure, volume, and entropy in systems of matter and energy. It includes topics such as the laws of thermodynamics, heat transfer, phase transitions, thermal equilibrium, and statistical mechanics.",
						" Electromagnetism: the study of the electric and magnetic fields and forces generated by electric charges and currents. It includes topics such as electrostatics, magnetostatics, electrodynamics, Maxwell’s equations, electromagnetic waves, optics, and relativity.",
						" Quantum mechanics: the study of the behavior of matter and energy at the atomic and subatomic levels. It includes topics such as wave-particle duality, uncertainty principle, Schrödinger equation, quantum states, superposition principle, entanglement, tunneling effect, and quantum field theory.",
						" Relativity: the study of the effects of relative motion and gravity on space-time and matter-energy. It includes topics such as special relativity"," Special relativity is the theory of inertial frames of reference."," General relativity is the theory of gravitational fields."," Length contraction is the shortening of lengths in motion or in gravitational fields."," Mass-energy equivalence is the equivalence between mass and energy."," Gravitational redshift is the shift in frequency of light due to gravity."," Gravitational lensing the bending of light due to gravity."," Black holes are the regions of space-time where gravity is so strong that nothing can escape."," Gravitational waves are ripples in space-time caused by accelerating masses."," Cosmology is the study of the origin, structure, evolution, and fate of the universe as a whole."," Cosmology includes topics such as the big bang theory.",
						" The big bang theory is the model that describes the initial state and expansion of the universe.",
						" Cosmic inflation is the rapid exponential expansion of space-time in the early universe.",
						" Cosmic microwave background radiation (the remnant thermal radiation from the early universe.",
						" Dark matter is the invisible matter that accounts for most of the mass in the universe.",
						" Dark energy is the mysterious force that causes the accelerated expansion of the universe",
						" The Cosmological constant is the energy density of empty space.",
						" The Cosmological horizon is the limit of observable distance in the universe.",
						" Hubble’s law is the relation between distance and recession velocity of galaxies.",
						" Cosmic nucleosynthesis is the formation of elements in the early universe.",
						" Stellar evolution is the life cycle of stars from birth to death.",
						" Supernovae are the explosive death of massive stars.",
						" Neutron stars are the dense remnants of supernovae.",
						" Pulsars are rotating neutron stars that emit beams of radiation.",
						" Quasars are extremely bright sources of radiation powered by supermassive black holes at the centers of galaxies.",
						" Galaxy formation and evolution is described as the processes that shape galaxies over time and large-scale structure is the distribution and clustering of matter on cosmic scales.",
						" The cosmic web is described as the network-like pattern formed by filaments and voids in large-scale structure."



					}

Here is the start of a database for different personalities.

local AdvanceObjective="the Castle"
local KnightDatabase={"The king has ordered us to advance upon "..AdvanceObjective.." at dawn! Be prepared to advance!", "Rumors? What do you think I am, some barmaid?! Away with you before you test my patience.", 
" Aye, there's a blacksmith in town somewhere.", "The baker sells all sorts of food items.", 
"The forest is full of dangers. Beware of wolves and bandits.",
 "Have you seen the princess? The king has us on guard.", 
"We are loyal to the king and his cause. Do not speak ill of him or his allies.",
 "You look like a capable fighter. Have you considered joining the Kings army?",
 "The tavern is a good place to rest and hear some stories.",
 "The healer can mend your wounds and cure your ailments."}

local PeasantDatabase={"What say ye adventurer? Would you be interested in helping with the harvest?",
 "The taxes keep on rising… The king has recruited so many for the campaign.",
 "There's pests in my garden running amok!",
 "I have nothing to offer you but my gratitude.", 
"Please, spare some coins for a poor soul.", 
"The lord of the manor is a cruel and greedy man. He cares not for us common folk.", 
"Have you heard of the prophecy? They say a hero will come and save us from the evil.",
 "I wish I could see the world beyond this town. But I have no money or skills to travel.",
 "My son has gone missing. I fear he may have been taken by the enemy.",
 "The harvest has been good this year. We have plenty of food to share."}

This is pretty cool!

Do you have a demo place for us to try out this model?


I want to share a related resource for creating quests. Here is an example of combining three sentence fragments to get over 5,000 different possible dialogue texts. I shared this with someone asking for ideas for different quests, so I wanted to include it here as a resource pertaining to the above topic of non-AI-driven language models. The benefit of this is very fast processing time; the whole script is only 14.9 KB.

Opening = {"Please would you assist me adventurer,","I have a task to ask of you,","There is something I need,","So,","I have this thing,","I need a favor,","I have a quest for you,","I have a request,","I have something I need taken care of,","I could use your help, adventurer,","I have a mission for you,","There is something I want,","Listen,","I have this problem,","I need a hand,","I have a quest for you,","I have a proposition,","I have something I need done,"}
	request= {" will you obtain for me " ," will you retrieve "," will you collect "," will you bring me "," I'm in need of  "," bring me ", " I need ", " I require ", "I would like "," can you get me " ," can you find "," can you gather "," can you deliver me "," I'm looking for  "," fetch me ", " I need ", " I demand ", "I desire "}
	closing= {"I will reward you a ","Your prize will be a ","In exchange I'll give you a ","Your reward will be a ", "I will reward you handsomely with a ","I can offer you in return this ",
		"In exchange for your services, I will reward you with a "," I can offer you this as a reward, it's a ","I will give you a ","Your reward will be a ","In return I'll offer you a ","Your prize will be a ", "I will pay you well with a ","I can give you this in exchange, it's a ",
		"For your service, I will grant you a "," I can reward you with this, it's a "}
QNPC2=NPCName
QEvent2=Quests.Event
QQuantity2=Quests.Quantity
QID2=Quests.ItemID
QReward2=Quests.Rewards
	CollectionQuestDescription=Opening[math.random(1,#Opening)]..request[math.random(1,#request)].." "..QQuantity2.Value.." "..QID2.Value..". "..closing[math.random(1,#closing)]..QReward2.Value.."."
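If you want to try the combiner outside of a quest setup, here is a minimal standalone sketch. QuestQuantity, QuestItem and QuestReward are made-up placeholder values standing in for the Quests value objects above; the fragment tables are the ones defined earlier.

-- Standalone sketch (placeholder values, not the original Quests objects):
local QuestQuantity, QuestItem, QuestReward = 5, "Iron Ore", "Steel Sword"

local description = Opening[math.random(1, #Opening)]
	.. request[math.random(1, #request)]
	.. QuestQuantity .. " " .. QuestItem .. ". "
	.. closing[math.random(1, #closing)]
	.. QuestReward .. "."

print(description)
-- e.g. "I have a quest for you, can you gather 5 Iron Ore. Your reward will be a Steel Sword."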

In celebration of 1,000 views on this thread, I have another update for the algorithm.

local phrases = { -- enclosing table added here; the name "phrases" is an assumption, the original snippet omitted the declaration

  -- Introductory
  {
    "Let me begin by saying,", "First of all,", "To start with,", "As an introduction,",
    "To kick things off,", "To get the ball rolling,", "First things first,"
  },

  -- Continuation
  {
    "Moving forward,", "Looking ahead,", "Building on that,", "Expanding on that idea,",
    "Carrying on with that train of thought,", "Following up on what was just said,", "Proceeding with this theme,"    
  },

  -- Clarification
  {
    "Or rather,", "On second thought,", "I should clarify,", "Let me rephrase that,",
    "Actually, to be more precise,", "To put it more accurately,", "What I really meant was,"
  },

  ...

  -- Examples
  {
    "For example,", "As an illustration,", "To give an example,", "Specifically,", "In particular,",
    "To name one,", "Such as,", "Including,", "As a case in point,"
  },

  ...

  -- Contrast
  {
    "However,", "On the other hand,", "In contrast,", "On the contrary,", "Conversely,",
    "Alternatively,", "Meanwhile,", "Instead,", "But,"
  },

  -- Punctuation
  {
    "Furthermore,", "Therefore,", "Finally,", "Indeed,", "Basically,", "Frankly,",  
    "Generally,", "Specifically,", "Additionally,", "Essentially,"
  },

  -- Opinions
  {
    "I believe,", "In my opinion,", "From my perspective,", "If you ask me,", "I think,"
  },

  -- Agreeing
  {
    "I agree,", "Yes,", "Definitely,", "For sure,", "No doubt,"    
  },

  -- Disagreeing 
  {
    "I disagree,", "No,", "I don't think so,", "I'm not so sure about that,", "I have doubts,"
  },

  -- Asking for opinions
  {
    "What do you think?,", "What is your take?,", "I'm curious to hear your thoughts,", "What's your perspective?", "I'd like your insight,"
  },

  -- Giving suggestions
  {
    "I would suggest,", "Maybe you could,", "You might try,", "Have you considered,", "Perhaps,"
  },

  -- Appreciation
  {
    "I appreciate that,", "Thank you for sharing,", "I am grateful for your perspective,", "I value your input," 
  },

  -- Understanding
  {
    "I understand,", "That makes sense,", "I see where you are coming from,", "I can relate to your view,"
  },

  -- Partial agreement
  {
    "I somewhat agree,", "You have a point, but,", "While I see some validity to that,", "I agree, with some caveats,"
  },
  
  -- Emotional support
  {
    "I'm here for you,", "You're not alone in this,", "It's understandable to feel that way,", "This too shall pass,", "Better times are ahead,"
  }
}
local morePhrases = {

  -- Hedging certainty
  {
    "It seems,", "It appears,", "Apparently,", "Presumably,", "Most likely,", "Chances are,"    
  },

  -- Seeking confirmation
  {
    "Does that make sense?,", "Do you follow?,", "Does that resonate?,", "Does this align?,", "Does this land?,"
  },

  -- Seeking agreement
  {
    "Don't you think?,", "Wouldn't you say?,", "You'd agree?,", "Right?,", "Yeah?,"
  },

  -- Encouraging
  {
    "Keep at it!,", "You can do it!,", "Stay strong!,", "Keep persevering!,", "Keep up the great work!,"
  },

  -- Praising
  {
    "Well done!,", "Great job!,", "Kudos to you!,", "You should be proud!,", "Wonderfully executed!,"
  },

  -- Light humor
  {
    "Funny you should say that!,", "The irony is not lost on me!,", "What a hilarious sentiment!,", "The humor is not wasted on me!,", "Such amusing thoughts!,"
  },

  -- Wordless acknowledgment
  {
    "I see,", "Ah,", "Oh,", "Hmm,", "Aha,", "Okay,"    

  -- Science
  {
    "According to scientific research,", "Studies have shown that,", "The data suggests,", "Experiments demonstrate,", 
    "There is scientific evidence that,", "Research indicates,", "Analysis reveals,", "The findings confirm that,",
    "Extensive peer-reviewed studies confirm,", "Comprehensive meta-analyses conclude that,",
    "Statistical models predict,", "Simulations support the hypothesis that,",
  },
  
  -- Technology
  {
    "This technology utilizes,", "The system is powered by,", "The architecture includes,", "The framework leverages,",
    "At the core is,", "The platform incorporates,", "The software implements,", "The algorithm enables,",
    "The hardware configuration consists of,", "The network infrastructure relies on,",
    "Latency is optimized by,", "Scalability is achieved through,"
  },

  -- Academia
  {
    "Scholars have theorized,", "There are competing schools of thought,", "Research indicates,",
    "The field is increasingly focused on,", "Experts have concluded,", "Prominent academics posit that,",
    "Evidence suggests,", "The literature supports,",
    "Leading researchers have demonstrated,", "Academic consensus holds that,", 
    "Data analysis strongly supports,", "The weight of empirical evidence favors,"
  },  

  -- History
  {
    "Historians note that,", "Records suggest,", "Accounts of the time describe,", "Evidence from the era demonstrates,",
    "Archaeology reveals,", "Primary sources show,", "Scholarly analysis reveals,", "Historical precedent indicates,",
    "Letters and diaries from the period indicate,", "Contemporary perspectives establish that,",
    "Interpretations of material evidence suggest,"
  },

  -- General
  {
    "It is clear that,", "We can determine that,", "Evidence confirms,", "There are strong indications that,",
    "Available information reliably shows,", "Credible sources lead to the conclusion that,", "It follows logically that,",
    "Applying rigorously skeptical analysis shows that,", "There is significant verifiable data to demonstrate that,",
    "Any examination of the facts definitively proves that,"
  }
}
local morePhrases = {

  -- Transitioning topics
  {
    "On another note,", "Changing topics,", "Moving to another subject,", "Switching gears,", 
    "Let's discuss something else for a moment,", "I'd like to talk about something different now,",
    "Shifting focus,", "Turning to another matter,", "Jumping to a new topic,"
  },  

  -- Encouraging participation
  {
    "What are your thoughts?", "I'd love to hear your perspective,", "Please feel free to chime in,",
    "Don't be shy, your point of view is valuable,", "Speak up, I want your input,",
    "Your voice matters here - please share your ideas,", "Help me understand your position - I'm listening,"
  },

  -- Validating feelings
  {
    "Your feelings are completely valid,", "It's understandable to feel that way,", "I can see why you would feel like that,",
    "That would make anyone feel the same way,", "You have every right to feel that way,",
    "I don't blame you for feeling that way,", "Your reaction makes perfect sense,"
  },

  -- Reassurance
  {
    "It will be okay,", "This will pass in time,", "Things will get better,", "Have faith, the future is bright,",
    "Stay positive, you have the strength to get through this,", "Brighter days are ahead,", 
    "Hard times are temporary, better things are coming,"
  },
   
  -- Asking for repetition
  {
    "Sorry, could you repeat that?,", "My apologies, I didn't quite get that - could you restate?,",
    "Please say that again,", "Pardon me, what was that?", "I'm afraid I didn't catch that, could you repeat?",
    "One more time please?,", "Let me hear that again if you don't mind,"
  },

  -- Clarifying confusion
  {
    "I'm afraid I'm a bit confused,", "You seem to have lost me there,", "Let me make sure I understand correctly,",
    "I want to make sure I'm not missing something - ", "Just to clarify, what do you mean exactly?",
    "You'll have to help me understand what you're saying - ", "I'm trying to follow but got a bit lost - "
  },

  -- Hedging opinions
  {
    "I might be wrong, but,", "Don't quote me on this, but,", "Take this with a grain of salt, but,", 
    "In my limited understanding,", "Without claiming to be an expert,",
    "I could be off base, but,", "I'm no authority, however,", "I'm far from an expert, so," 
  }
}

-- Synonym groups: the enclosing table and its name are assumptions here (chosen to
-- match the synonympairs table used by Getsynonyms later in this thread); the
-- original post omitted the opening declaration.
local synonympairs = {
{ "knowledgeable ", "educated ", "learned ", "expert ", "skilled ", "familiar "},
{ "smoothly ", "effortlessly ", "seamlessly ", "gracefully ", "easily ", "pleasantly "},
{ "faring ", "progressing ", "coping ", "managing ", "doing ", "performing "},
{ "ditches ", "trenches ", "gutters ", "channels ", "holes ", "cavities "},
{ "careful ", "prudent ", "vigilant ", "attentive ", "mindful ", "wary "}
,
{ "events ", "incidents ", "happenings ", "phenomena ", "circumstances ", "scenarios "}
,
{ "Did you know ", "Are you aware ", "Do you know ", "Have you learned ", "Are you familiar "}
,
{ "trapped ", "confined ", "imprisoned ", "enclosed ", "cornered ", "snared "}
,
{ "happening ", "taking place ", "transpiring ", "occurring ", "unfolding "}
,
{ "mindless", "thoughtless", "brainless", "unthinking", "senseless", "vacant"}
,
{ "used", "utilized", "employed", "applied", "exploited", "harnessed"}
,
{ "touches", "contacts", "reaches", "feels", "grazes", "strokes"}
,
{ "feeling", "doing", "being", "faring", "coping", "managing"}
,
{ "infinite", "unlimited", "boundless", "endless", "eternal", "immense"}
,
{ "treasures", "trinkets", "artifacts", "loot, "spoils, "riches"}
{ "destructive", "damaging", "harmful", "ruinous", "devastating", "catastrophic"}
,
{ "absorb", "assimilate", "take in", "soak up", "ingest", "incorporate"}
,
{ "However,", "Nevertheless,", "Nonetheless,", "Yet,", "Still,", "But,"}
,
{ "encounter", "meet", "face", "confront", "come across", "run into"}
,
{ "trap", "snare", "trap", "lure", "ambush", "ensnare"}
,
{ "minion", "follower", "servant", "henchman", "underling", "subordinate"}
{ "conjunction", "combination", "union", "connection", "link", "relation"}
,
{ "dimension", "realm", "world", "plane", "space", "zone"}
,
{ "regenerate", "recuperate", "recover", "heal", "renew"}
,
{ "topic", "subject", "issue", "matter", "question"}
,
{ "entities", "creatures", "beings", "organisms", "forms", "things"}
{ "warp", "distort", "twist", "bend", "deform", "contort"}
,
{ "strong", "powerful", "mighty", "forceful", "intense", "potent"}
,
{ "facts", "information", "data", "evidence", "truths", "realities"}
,
{ "infinite", "unlimited", "boundless", "endless", "eternal", "immense"},
{ "given", "bestowed", "granted", "awarded", "presented", "conferred"}
,
{ "quantity", "amount" ,"measure"}
,
{ "quantities", "amounts" "measures"}
}

Today I am researching using APIs, and I have successfully given this bot the ability to use Wikipedia to answer questions it does not know the answer to. I wrote this with the help of ChatGPT; we had a back and forth, and eventually we figured out how to access summaries and entire articles on Wikipedia in text form. If you would like to try out this interesting code, it will work as is.

local HttpService = game:GetService("HttpService")

local function SearchWikipedia2(searchq)
	local url = "https://en.wikipedia.org/w/rest.php/v1/search/page?q="
	-- Define the query parameters
	-- Make the request and get the response
	local success, response = pcall(function()
		return HttpService:RequestAsync({
			Url = url .. HttpService:UrlEncode(searchq), -- URL-encode the query so spaces and punctuation are safe
			Method = "GET"
		})
	end)

	-- Check if the request was successful
	if success then
		-- Check if the response status code was 200 (OK)
		if response.StatusCode == 200 then
			-- Parse the response body as JSON
			local data = HttpService:JSONDecode(response.Body)

			-- Get the first item from the result list
			local item = data.pages[1]

			-- Extract the title and text from the item
			local title = item.title
			local excerpt = item.excerpt
			local pattern = "<span class=\"searchmatch\">(.-)</span>"

			-- Replace the HTML tags with empty strings
			local text = excerpt:gsub(pattern, "%1")

			-- Print the title and text to the output
			print(title)
			print(text)

			-- Extract the key from the item
			local key = item.key

			-- Construct the article URL from the key and the base URL
			local base_url = "https://en.wikipedia.org/w/api.php?action=query&prop=extracts&exintro&explaintext&titles="
			local article_url = base_url .. key.."&format=json"

			-- Print the article URL to the output
			print(article_url)

			-- Make another request to get the article content
			local success, response = pcall(function()
				return HttpService:RequestAsync({
					Url = article_url,
					Method = "GET"
				})
			end)

			-- Check if the request was successful
			if success then
				-- Check if the response status code was 200 (OK)
				if response.StatusCode == 200 then
					-- Parse the response body as JSON
					
					-- Access the extract property of the JSON object
					local data = HttpService:JSONDecode(response.Body)

					-- Access the pages table of the JSON object
					local pages = data.query.pages

					-- Use the pairs function to iterate over the pages table
					for key, value in pairs(pages) do
						-- Get the page key and the extract from each subtable
						local page_key = key
						local extract = value.extract

						-- Print the page key and the extract to the output
						--print(page_key)
						print(extract)
					end
					print(data)
					-- Print the extract to the output
					--print(extract)
				else
					-- Print the response status code and status message to the output
					print("Error: " .. response.StatusCode .. " " .. response.StatusMessage)
				end
			else
				-- Print the error message to the output
				print("Error: " .. response)
			end
			
		end 
	end
	end




SearchWikipedia2("What is an archer?")

I have given the bot the ability to learn from APIs and artificial intelligence models. Here's some insight into how I parse data from Wikipedia. First I filter out bad words by reading the excerpt for the page the search query leads to. If the excerpt contains offensive content, it checks the next page, and so on, or returns nil and continues with the stack. Since the returned strings are so long, I parse them into paragraphs using punctuation and a 500-character limit, then search the sections for the best result. Each API uses a different datastore key, and the saved data is paired up with responses, like AI models. This will be important as the bot learns from its interactions, and it also lets me scale its usage while minimizing API access.
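Here is a rough sketch of that chunking step (not my exact implementation); it splits a long article string on sentence punctuation while keeping each chunk under roughly 500 characters. The function name is just for illustration.

local function splitIntoChunks(text, maxLen)
	maxLen = maxLen or 500
	local chunks = {}
	local current = ""
	for sentence in text:gmatch("[^%.%?!]+[%.%?!]?") do
		if #current + #sentence > maxLen and #current > 0 then
			table.insert(chunks, current) -- flush the chunk before it exceeds the limit
			current = ""
		end
		current = current .. sentence
	end
	if #current > 0 then
		table.insert(chunks, current)
	end
	return chunks
end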

I have also improved the accuracy of the bot by getting the best result from each database, putting those results into a table, and then searching that table for the best match. I created this solution because, as the database sizes increase, the entry at the top of the stack often gets returned first. This is a new issue introduced by the modularization.
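In sketch form, that two-pass search looks something like this. The wrapper name SearchAllDatabases is just for illustration; SearchQuery is the module function shown later in this thread.

local function SearchAllDatabases(cm, query, databases)
	local winners = {}
	-- first pass: take the best entry from each individual database
	for _, database in ipairs(databases) do
		local best = cm.SearchQuery(query, database)
		if best then
			table.insert(winners, best)
		end
	end
	-- second pass: rank the per-database winners against each other,
	-- so stack order no longer decides which database wins
	return cm.SearchQuery(query, winners)
end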

Future plans are to make the bot more context aware by extracting keywords from the context of the conversation.


I have created an emoji insertion algorithm using the synonymous phrase table and a new method of scoring strings based on their pattern-matching score. This is an open-world use of this algorithm. It's very fast, unlike regular AI. The GUI is adorned to the character locally, and when it accesses APIs or artificial intelligence models it uses a remote event to trigger the query and wait for the response.
This generates the text character by character and parses the response into sentences for readability.
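Here is a stripped-down sketch of that character-by-character display. The bubble parameter stands in for whatever TextLabel you adorn to the character; the name is a placeholder of mine.

local function typeOut(bubble, response)
	-- reveal the response one sentence at a time, one character per step
	for sentence in response:gmatch("[^%.%?!]+[%.%?!]?") do
		bubble.Text = ""
		for i = 1, #sentence do
			bubble.Text = sentence:sub(1, i)
			task.wait(0.02) -- typewriter pacing
		end
		task.wait(1) -- pause so the sentence can be read before the next one
	end
end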







I have completed the Awareness module! Here it is in action!


I have done a massive update to this ChatModule. It still works the same way, except it has more features and is more efficient and accurate! The main changes are below. These functions are basically the architecture of the search algorithm. To utilize the synonyms, just set the complete variable to true. All the variables passed through the function besides query and database are basically true/false flags. You can also leave a variable out of your function call, and it will be treated as false when nil.

function chatmodule.Getsynonyms(s,complete)
	if string.len(s)>=3 and complete==true then
		for j, phrases in ipairs(synonympairs) do
			for k, phrase in ipairs(phrases) do
				if s:find(phrase) then
					return phrases -- return the entire synonym group the word belongs to
				end
			end
		end
	end
	return {s} -- no expansion: wrap the original word in a table
end
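-- Usage sketch for Getsynonyms (my own illustration; it assumes synonympairs holds
-- groups like the ones posted earlier in this thread, e.g.
-- { "strong", "powerful", "mighty", "forceful", "intense", "potent" }):
--   chatmodule.Getsynonyms("powerful", true)  --> returns the whole strong/powerful/mighty group
--   chatmodule.Getsynonyms("powerful")        --> returns { "powerful" } (no expansion when complete is not true)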
function chatmodule.countKeyword(str, synomarray,filter,complete)
	local count = 0
	local words=chatmodule.splitString(str,filter)
	local weight=#words -- number of words in the entry, used for length weighting
	for _, words2 in ipairs(synomarray) do
		for _, keyword in ipairs(words2) do
			for _, word in ipairs(words) do
				local word=word:lower()
				local keyword=keyword:lower()
				if word == keyword then
					count = count + 1
				--elseif keyword:find(word) then
				--	count = count + 1
				end
			end
		end
	end
	return count,weight
end

function chatmodule.findBestMatch(strings, keyword,filter,mode,complete)
	local bestMatch = nil -- the best match string
	local bestCount = 0 -- the highest count of keyword occurrences
	local best_match, strm
	local bestweight
	local synomarray={}
	local words2 = chatmodule.splitString(keyword,filter)
	for _, originword in ipairs(words2) do	
		if complete==true then
			local syn=chatmodule.Getsynonyms(originword,complete)
			table.insert(synomarray,syn)
			--print(synoynms)
		else
			synomarray={{originword}}--double nestedtable
		end	
	end
		
		
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local str = strings[i]
		--if filter==true then	
			--strm=chatmodule.ReduceQuery(str)	
		--else strm=str	
		--end
		local check=false
		if blacklist then
		for i, blkl in ipairs(blacklist) do 
			
			if str==blkl then
				check=true
			end
			end
		end	
		if check==false then
			local count,weight = chatmodule.countKeyword(str, synomarray,filter,complete) 
			if mode==true then
				count=count/weight
			end	
			if count> bestCount then -- if the count is higher than the best count so far
				bestMatch = str -- update the best match string
				bestCount = count-- update the best count number
				bestweight=weight	
			end
		end	
	end
	--if bestMatch then
	--print(bestMatch.." "..keyword.." "..table.concat(words2," "))
	--end	
	return bestMatch, bestCount,bestweight -- return the best match and its count
end
function chatmodule.SearchQuery(query,database,filter,repetitive,randomize,reverse,spaces,mode,complete)
	local matches = {} -- A table to store the matches and their scores
	local BlacklistedKeyword
	local result = nil 
	local score	
	if  spaces==true then
	return chatmodule.find_closest_match(database, query)
	else
		local bestMatch,bestCount
				bestMatch,bestCount,weight = chatmodule.findBestMatch(database,query,filter,mode,complete)
				-- Find the best match and its count for each word using your findBestMatch function
				if bestMatch then -- If there is a match
					if matches[bestMatch] then -- If the match is already in the table
						matches[bestMatch] = matches[bestMatch] + bestCount -- Add the count to its score
					else -- If the match is not in the table yet
						matches[bestMatch] = bestCount -- Set its score to the count
					end
				end	
	local sortedMatches = {} -- A table to store the sorted matches by their scores
	for match, score in pairs(matches) do -- Loop through the matches and their scores
		table.insert(sortedMatches, {match = match, score = score}) -- Insert each match and its score as a subtable into the sortedMatches table
	end
	table.sort(sortedMatches, function(a, b) return a.score > b.score end) -- Sort the sortedMatches table by the score field in descending order
	if #sortedMatches > 0 then -- If there is at least one match
		--result = "The best match(es) for '" .. query .. "' are:\n" -- Start the result string with an introduction
		if randomize==true and  #sortedMatches>0 then
			local num=chatmodule.mathrandom(1,math.min(#sortedMatches, 3))
			result=sortedMatches[num].match
			score=sortedMatches[num].score
		elseif #sortedMatches>0 then
			if filter==true or filter==false or filter==nil then				
				result=sortedMatches[1].match
				score=sortedMatches[1].score
			elseif filter==1  then
				local results,weight=	chatmodule.SearchQueryPattern(query, sortedMatches, filter, repetitive, randomize, reverse, spaces)
				result=results.match
				score=results.score
				end	
		end	
		context=context+1
			if blacklist==nil then
				blacklist={}
			end
			if repetitive==false and result and blacklist then
				table.insert(blacklist,result)
			end		
			
		--result=sortedMatches[chatmodule.mathrandom(1,math.min(#sortedMatches, 3))]
	else -- If there are no matches
		result=nil
	end
	end	
--	print(blacklist)
	return result,blacklist,score,weight
	-- Return the result string
end

This ChatModule works amazing and I’m glad to be able to share it with you all! I use this in conjunction with artificial intelligence APIs and a bunch of other modules I’m developing.
Please refer to my 1st example in the parent post for examples of how to set up some basic chat logic.
All of the functions besides the chat logic should be dependent on this Module.

cm=require(script.Parent.ChatModule)
--The minimalist way to use the function
cm.SearchQuery(query,database)
--Sigma chad way to use the function
--Mode weighs the match by dividing the keyword count by the entry's word count
cm.SearchQuery(query,database,true,false,false,false,false,true,true)
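--Annotated version of the full call (my own labeling, mapped onto the
--SearchQuery(query,database,filter,repetitive,randomize,reverse,spaces,mode,complete) signature above)
local result, blacklist, score, weight = cm.SearchQuery(
	query,    -- the player's message
	database, -- table of candidate response strings
	true,     -- filter: passed to splitString when breaking text into words
	false,    -- repetitive: false blacklists each returned result so it is not repeated
	false,    -- randomize: true would pick randomly among the top 3 matches instead of the single best
	false,    -- reverse: forwarded to SearchQueryPattern
	false,    -- spaces: true routes the query to find_closest_match instead
	true,     -- mode: divide the keyword count by the entry's word count (length weighting)
	true      -- complete: expand query words with the synonym table
)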

I have posted a new resource about this Vector Matrix Library I'm writing for this module, initially to maintain context weight across databases, and then maybe to get Vector Matrix machine learning incorporated deeper into the chatbot module.
NEW Vector Matrix Library, Synonym Chat Module Library and Awareness/Emotion Modules Linked [Open Sourced] - Resources / Community Resources - Developer Forum | Roblox


I have added a new function to this module! I noticed it was very easy to make, so I wanted to show how this library can be used for machine learning and data analysis.


function chatmodule.BagOfWords(strings)
	local model={}
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=chatmodule.splitString(str)
		for _,wo in ipairs(words) do
			-- reduce the word to the first entry of its synonym group
			local s=chatmodule.Getsynonyms(wo:lower(),true)
			if model[s[1]]==nil then model[s[1]]=0 end -- start at 0 so the first occurrence counts once
			model[s[1]]=model[s[1]]+1
		end
	end	
	return model
end

As simple as that! The cool part is that we are leveraging a very optimized Getsynonyms function that reduces each word to a simpler canonical representation, massively reducing the size of the vocabulary being cached in our Bag of Words model. That is why, in this example, I get the synonyms and then use the first key of the nested array as the entry's key in the model. You can use this to create representations of more complex data. Maybe in the future we will be exploring next-word prediction?
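Here is a quick usage sketch; the corpus strings are just made-up examples. Because Getsynonyms maps each word onto the first entry of its synonym group, words like "powerful" and "mighty" accumulate under the same key, which is what keeps the vocabulary small.

local cm = require(script.Parent.ChatModule)

local corpus = {
	"The mighty wizard cast a powerful spell",
	"A strong warrior guards the gate",
}

local model = cm.BagOfWords(corpus)
for word, frequency in pairs(model) do
	print(word, frequency)
end
-- e.g. "strong" ends up holding the combined count of strong/powerful/mighty,
-- since they share a synonym group like the one listed earlier in the thread.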


Building on the BagOfWords concept, I have made another type of model with this module. This one is kind of novel: it is basically a previous- and next-word predictor. You can see the code here.

function chatmodule.PredictEveryOtherWord(strings)
	local model={}
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=chatmodule.splitString(str)
		for t,wo in ipairs(words) do
			local prevw,nextw
			if wo~="I" then
				wo=wo:lower()
			end
			local s=chatmodule.Getsynonyms(wo,true)		
			
			if model[s[1]]==nil then 
				model[s[1]]={}
				model[s[1]]["fr"]=1
				model[s[1]]["pw"]={}
				model[s[1]]["nw"]={}
			--	print(model[s[1]])	
			end
			model[s[1]].fr=model[s[1]].fr+1
			if t~=1 then
				local prev=chatmodule.Getsynonyms(words[t-1],true)
				prevw=prev[1]
				if model[s[1]].pw[prevw]==nil and prevw then
				--	model[s[1]].pw[prevw]=
					model[s[1]].pw[prevw]=1 
				else model[s[1]].pw[prevw]=model[s[1]].pw[prevw]+1	
				end
				if t~=#words then
					local nex=chatmodule.Getsynonyms(words[t+1],true)
						nextw=nex[1]
					
					if model[s[1]].nw[nextw]==nil then model[s[1]].nw[nextw]=1 
					else model[s[1]].nw[nextw]=model[s[1]].nw[nextw]+1	
					end
				end

			end			
	
		end
	end	
	--print(model)
	local responses={}
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=chatmodule.splitString(str)
		local eo=0
		local news=""
		local prevc=str
		for t,wo in ipairs(words) do
			local prevw,nextw
			
			eo=eo+1
			if t>=1 then
			if eo==2 then eo=0
				if wo~="I" then
			wo=wo:lower()
			end
			local s=chatmodule.Getsynonyms(wo,true)		
					model[s[1]].fr=model[s[1]].fr+1
					local tnw
					if t~=#words then
					local hc=0
					--=words[i+1]
					for c,t in model[s[1]].nw do
						if t>hc then
							hc=t
							tnw=tostring(c)	
						end
						end
						
					end
					--if t~=#words then
						local hc=0
						local lw=words[i-1]
						for c,t in model[s[1]].pw do
						--print(c)
						if c~="I" then
							c=string.lower(c)
						end
						--local we =model[c].fr/2
						local sol=t
						if sol>hc then
							
							hc=sol
														
								lw=tostring(c)	
							end
					end
					if lw and lw:lower()~=prevc:lower() then
						news=news.." "..lw
					end	
					prevc=s[1]
					if tnw and prevc:lower()~=tnw:lower() then
						news=news.." "..s[1].." "..tnw
						prevc=tnw
					end
					
					--table.insert()
					--table.sort(model, function(a, b) return a.fr > b.fr end)
				else
					--news=news.." "..wo	
				end	
			else news=news.." "..wo	prevc=wo
			end	
		end
		table.insert(responses,chatmodule.randomizeString(news))
	end	
	
	print(responses)
	--table.sort(model, function(a, b) return a.fr > b.fr end)

	return model
end

After training this model on a large corpus of data and running it, I got these results as a table. Kind of interesting. With some more refinement to the algorithm, the output may become something useful.
After editing this function a little bit and finding the bug, I got these results.

	"I am Lilith, a fallen angel consumed by darkness.",
		"Greetings mortal, you stand in the presence of forbidden knowledge.",
		"Your light means nothing here. This is my domain of shadows.",
		"You have come seeking power. I can give you this, for a price...",
		"I am the Angel of Darkness, the mistress of secrets and lies.",
		"Welcome to my realm, traveler. I hope you are prepared for what awaits you here.",
		"Your soul is mine to claim. This is the pact you have made with me.",
		"You have come to learn from me, the master of dark magic. How brave of you.",
		"I am the Herald of the Dark, mortal. My footsteps herald oblivion.",

		"You now stand in the presence of me! The Angel of Darkness, the Devourer, mortal. Prepare to feed the endless hunger of the void.",

		"Bear witness to me, emissary of the insatiable dark! I am the annihilation that comes ravening from the endless night.",

		"I am Vhorzun, worm. My masters in the screaming darkness have granted me a sliver of their boundless hunger to unmake your realm.",

		"The stars grow dim and the veil frays. The final era approaches, and I am its herald. I am Vhorzun of the Endless Hunger!"
	} print(cm.PredictRun(Greetings,mo))  -  Studio
  01:24:35.544   ▼  {
                    [1] = " I am the is a goddess an angel Belldandy and by two",
                    [2] = " hi mortal I to stand up of the shiny goddess of the of",
                    [3] = " the luminous and that not a unison thing but in this is to my life goddess of the",
                    [4] = " you have to keep seeking the mortal I am never donate up in this is a goddess",
                    [5] = " I am the an angel Belldandy the dark realm I my secrets unfold of",
                    [6] = " need to be my realm eternal mortal I am if you can you make ready upon confess what you if you can",
                    [7] = " your immortal-soul and I forecast dominion it is the you have to associated with a",
                    [8] = " you have to require to be came from the of the intelligent goddess of the much alchemy in be adventurous and if",
                    [9] = " I am the of the luminous hello mortal I s footsteps of",
                    [10] = " it now and believe in the presence of to me as an angel Belldandy the dark cloud I are make make ready to feed your s endless life goddess of the",
                    [11] = " to me as goddess of the mortal I am of the clever is that s of the clever the",
                    [12] = " I am the of the shiny the dark dimension I repeatedly granted you is a goddess of the desire to be of your life",
                    [13] = " the stars born not dim that of the luminous the concluding key mortal I am whole its people mortal I am of the luminous a"

There might be a bug with the last word. Also, this version only predicts every other word; in the future it could attempt to predict every next word. While looking at the model, I noticed that its representation of the connections between words is still not trained enough. You can try out my Bag Of Words model with my chatmodule linked here.
(1) BagOfWords Model - Roblox
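For anyone who wants to experiment with predicting every next word instead, here is a minimal sketch (the function name and parameters are mine, not part of the module) that greedily chains the most frequent entry in the nw table from a seed word:

-- Hypothetical: greedily walk the next-word counts to generate a short chain.
local function chainNextWords(model, seed, maxWords)
	local out = { seed }
	local current = seed
	for _ = 1, maxWords do
		local entry = model[current]
		if entry == nil then break end
		local best, bestCount = nil, 0
		for word, count in pairs(entry.nw) do
			if count > bestCount then
				best, bestCount = word, count
			end
		end
		if best == nil then break end
		table.insert(out, best)
		current = best
	end
	return table.concat(out, " ")
end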

My favorite output would have to be this line
" the stars born not dim that of the luminous the concluding key mortal I am whole its people mortal I am of the luminous"
from this input sentence
“The stars grow dim and the veil frays. The final era approaches, and I am its herald. I am Vhorzun of the Endless Hunger!”


After testing a parallel version of this, I found bottlenecks in data transfer between cores. It runs best when you keep your data within the module, because transferring data between modules, such as a database, is very expensive. I've been working with different-sized datasets and constructed an array of word frequency data. It's about 600 MB of vector data keeping track of each word's frequency, the previous word, the next word, the previous two words, and the next two words.

The updated module includes functions to train AI models on word data.
I started out with previous- and next-word predictions for a sentence, which is potentially useful for connecting sentences.
Then I moved on to also tracking the two previous and two next words so the model can fill in the blanks of a dataset.

The algorithm is different from typical word vectors because it considers synonyms when constructing the vector; the synonyms are then unpacked and randomized. It's a unique tradeoff that results in a much smaller vocabulary.

In conclusion, I think the model's vector data can be compressed further by constructing a direct lookup table with the Getsynonyms function, as sketched below.
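Here is a rough sketch of that lookup-table idea, assuming cm is the required chatmodule and Getsynonyms(word, true) returns the canonical form as its first entry, as in the training code:

-- Precompute word -> canonical-synonym so training/prediction can skip repeated Getsynonyms calls.
local function buildSynonymLookup(vocabulary)
	local lookup = {}
	for _, word in ipairs(vocabulary) do
		local s = cm.Getsynonyms(word, true)
		lookup[word] = s[1]
	end
	return lookup
end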


I implemented a sigmoid function, which converts any number to a value between 0 and 1, so I can use the weights of each database as a multiplier for each context database.

function sigmoid(x)
  return 1 / (1 + math.exp(-x))
end
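For example, one way this can be used (shown here as a sketch with made-up counts) is to squash raw frequency counts from the model into weights between 0 and 1:

-- Example: turn raw word frequencies into 0-1 weights (counts are illustrative).
local frequencies = { the = 120, stars = 7, herald = 2 }
local weights = {}
for word, count in pairs(frequencies) do
	weights[word] = sigmoid(count) -- large counts saturate toward 1
end
-- sigmoid(0) = 0.5, so you may want to offset or scale counts, e.g. sigmoid(count - 5)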

Check out this function that trains on a large corpus of data then tries to use what it learned to fill in the blanks.

function cm.TrainLargeModel(strings,model)
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=cm.splitString(str)
		for t,wo in ipairs(words) do
			if wo~="I" then
				wo=wo:lower()
			end
			local s=cm.Getsynonyms(wo,true)
			local key=s[1] -- canonical synonym used as the vocabulary key

			if model[key]==nil then
				model[key]={
					fr=0,   -- word frequency
					pw={},  -- previous-word counts
					nw={},  -- next-word counts
					p2w={}, -- counts for the word two positions back
					n2w={}, -- counts for the word two positions ahead
				}
			end
			model[key].fr=model[key].fr+1

			if t>1 then
				local prevw=cm.Getsynonyms(words[t-1],true)[1]
				model[key].pw[prevw]=(model[key].pw[prevw] or 0)+1
			end
			if t>2 then
				local prev2w=cm.Getsynonyms(words[t-2],true)[1]
				model[key].p2w[prev2w]=(model[key].p2w[prev2w] or 0)+1
			end
			if t<#words then
				local nextw=cm.Getsynonyms(words[t+1],true)[1]
				model[key].nw[nextw]=(model[key].nw[nextw] or 0)+1
			end
			if t<#words-1 then
				local next2w=cm.Getsynonyms(words[t+2],true)[1]
				model[key].n2w[next2w]=(model[key].n2w[next2w] or 0)+1
			end
		end
	end
	return model
end

function cm.EvaluteCorpus()
	local dbs=require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.SupportingData:Clone())
	if not personalities then personalities=require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.Personalities) end
	--personalities.AllPersonalities()
	local Greetings,inquiry,IDK,Database,wisdom=personalities.AllPersonalities()
	local model={}
	--model=require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.BagOfWords:Clone())

	model=cm.TrainLargeModel(Greetings,model)
	task.wait()
	model=cm.TrainLargeModel(wisdom,model)
	task.wait()
	model=cm.TrainLargeModel(Database,model)
	task.wait()
	model=cm.TrainLargeModel(dbs.Spirituality(),model)
	task.wait()
	model=cm.TrainLargeModel(dbs.ScienceWisdom(),model)
	task.wait()
	model=cm.TrainLargeModel(dbs.Truths(),model)
	task.wait()
	model=cm.TrainLargeModel(dbs.Inspiration(),model)
	task.wait()
	model=cm.TrainLargeModel(dbs.Motivation(),model)
	--dbs.Sprituality()
	return model
end


function cm.PredictRun2(strings,model)
	local responses={}
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=cm.splitString(str)
		local eo=0
		local news=""
		local prevc=str
		local hci=0    -- best next-word count carried over from the skipped words
		local tnwo=nil -- best next-word candidate carried over from the skipped words

		for t,wo in ipairs(words) do
			eo=eo+1
			if eo>=3 then -- only every third word is predicted; the rest feed the carry-over
				eo=0
				if wo~="I" then
					wo=wo:lower()
				end
				local s=cm.Getsynonyms(wo,true)
				local key=s[1]
				if model[key] then
					-- most frequent next word, competing with the carry-over candidate
					local tnw=nil
					if t~=#words then
						local hc=hci
						tnw=tnwo
						for c,count in model[key].nw do
							if c~="I" then
								c=string.lower(c)
							end
							if count>hc then
								hc=count
								tnw=tostring(c)
							end
						end
						hci=0
						tnwo=nil
					end

					-- most frequent word two positions ahead
					local tn2w=nil
					if t<#words-1 then
						local hc=0
						for c,count in model[key].n2w do
							if c~="I" then
								c=string.lower(c)
							end
							if count>hc then
								hc=count
								tn2w=tostring(c)
							end
						end
					end

					-- random previous-word candidate (falls back to the actual previous word)
					local lw=words[t-1]
					local pwCount=0
					for _ in model[key].pw do
						pwCount=pwCount+1
					end
					if pwCount>0 then
						local roll=cm.mathrandom(1,pwCount)
						local n=0
						for c,count in model[key].pw do
							n=n+1
							if n==roll then
								if c~="I" then
									c=string.lower(c)
								end
								lw=tostring(c)
							end
						end
					end

					-- random previous-two-word candidate
					local l2w=nil
					if t>=3 then
						l2w=words[t-2]
						local p2wCount=0
						for _ in model[key].p2w do
							p2wCount=p2wCount+1
						end
						if p2wCount>0 then
							local roll=cm.mathrandom(1,p2wCount)
							local n=0
							for c,count in model[key].p2w do
								n=n+1
								if n==roll then
									if c~="I" then
										c=string.lower(c)
									end
									l2w=tostring(c)
								end
							end
						end
					end

					-- assemble the filled-in phrase, avoiding immediate repeats
					if l2w and l2w:lower()~=prevc:lower() then
						news=news.." "..l2w
					end
					if lw and lw:lower()~=prevc:lower() then
						news=news.." "..lw
						prevc=lw
					elseif t~=1 then
						news=news.." "..words[t-1]
					end
					if tnw and prevc:lower()~=tnw:lower() then
						news=news.." "..key.." "..tnw
						prevc=tnw
					elseif t<#words then
						news=news.." "..key.." "..words[t+1]
					end
					if tn2w and prevc:lower()~=tn2w:lower() then
						news=news.." "..tn2w
						prevc=tn2w
					end
					prevc=key
				end
			else
				-- for skipped words, remember their strongest next-word prediction
				if wo~="I" then
					wo=wo:lower()
				end
				local s=cm.Getsynonyms(wo,true)
				if model[s[1]] then
					for c,count in model[s[1]].nw do
						if c~="I" then
							c=string.lower(c)
						end
						if count>hci then
							hci=count
							tnwo=tostring(c)
						end
					end
				end
			end
		end
		table.insert(responses,news)
	end
	print(responses)
	return responses
end

When you apply the sigmoid function, it turns all of the word counts into weights that can be used to weight entries in a database.
I'm more or less learning this by reverse engineering, so the concepts were applied by conjecture rather than from theory.
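As a loose illustration of that idea (the names below are mine, not the ChatModule API), a context database entry could be scored by summing the sigmoid weights of the words it shares with the query:

-- Hypothetical sketch: weight a database entry by the sigmoid of shared-word frequencies.
local function scoreEntry(model, queryWords, entryWords)
	local score = 0
	for _, qw in ipairs(queryWords) do
		for _, ew in ipairs(entryWords) do
			if qw:lower() == ew:lower() then
				-- assumes the word is stored under its lowercase/canonical form;
				-- in practice you would route through Getsynonyms first
				local entry = model[qw:lower()]
				local fr = entry and entry.fr or 0
				score = score + sigmoid(fr) -- each shared word contributes up to 1
			end
		end
	end
	return score
end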


I updated this post because it appears that a lot of people are glancing over the ChatModule. It's much more efficient and accurate than the first iteration I shared! So you should update your functions with the CompleteQuery function in the ChatModule library. It weighs synonyms lower than the original words: a synonym scores 1, the original word scores 2, and antonyms score -0.75. It's very good. :slight_smile: You could also use getEmotion to create emotion-based responses, as I did with an emotional response database called empathy, and to get the mood of the NPC.
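To illustrate the scoring scheme described above (this is only a sketch of the idea, not the actual CompleteQuery implementation):

-- Sketch of the described weighting: exact word = 2, synonym = 1, antonym = -0.75.
-- synonymSet and antonymSet are hypothetical lookup tables for the query word.
local function scoreWordMatch(queryWord, entryWord, synonymSet, antonymSet)
	if entryWord == queryWord then
		return 2
	elseif synonymSet[entryWord] then
		return 1
	elseif antonymSet[entryWord] then
		return -0.75
	end
	return 0
end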


Here is how to scale massive databases using the command prompt.
I cleaned this massive dataset, grouped the entries by their first word, used that word as a key for the database, and then generated a module for each key to optimize performance, since this dataset has over 120,000 entries. A rough sketch of the grouping step is shown below.
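-- Rough sketch: bucket entries by their first word; each bucket can then be written to its own module.
local function groupByFirstWord(entries)
	local buckets = {}
	for _, entry in ipairs(entries) do
		local words = cm.splitString(entry) -- assumes cm is the required ChatModule
		local key = (words[1] or ""):lower()
		if buckets[key] == nil then
			buckets[key] = {}
		end
		table.insert(buckets[key], entry)
	end
	return buckets
end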


I wanted to showcase this result I got from training a previous- and next-word prediction algorithm with a word vector model.
" the stars born not dim that of the luminous the concluding key mortal I am whole its people mortal I am of the luminous"
from this input sentence
“The stars grow dim and the veil frays. The final era approaches, and I am its herald. I am Vhorzun of the Endless Hunger!”

What is the GlobalSpells module? There are also a lot of other modules in the chatbot that make this completely unusable, since you didn't share them.
