
AI Associate Exam Glossary of Terms

 

If this is the first time you are coming across the AI Associate certification, or you are new to the AI world, quite a few terms might feel unfamiliar to you. I have collated a glossary of terms that you will see repeatedly and should familiarize yourself with.

Keeping it short and simple, here they are:

  • NLP – Natural Language Processing – interpreting everyday language and acting on it in a meaningful way.
  • An algorithm is a simple set of rules for turning an input into an output.
  • Machine Learning is the process of using large amounts of data to train a model to make predictions, instead of handcrafting an algorithm. The model learns and adapts based on patterns in the data to make accurate predictions – in short, how computers can learn new things without being explicitly programmed to do them (see the short sketch after this list).
  • Supervised Learning (learning from labelled examples) uses structured data (organized and formatted), whilst Unsupervised Learning (learning without guidance, finding hidden patterns) uses unstructured data (not formatted: images, audio, video, etc.).
  • Generative AI creates new content, based on existing data, in response to a human prompt.
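To make the difference between an algorithm and machine learning concrete, here is a minimal Python sketch (my own illustration, not exam material). The spam scenario, the two features, and the tiny training set are assumptions chosen purely for illustration: the first function is a handcrafted algorithm, while the decision tree is a supervised model that learns the rules from labelled examples.

    from sklearn.tree import DecisionTreeClassifier

    # Algorithm: a fixed set of rules written by a person.
    def rule_based_spam_check(num_links, has_greeting):
        return num_links > 3 and not has_greeting

    # Supervised machine learning: the model learns the rules itself
    # from labelled (structured) training examples.
    features = [[0, 1], [5, 0], [1, 1], [7, 0]]  # [num_links, has_greeting]
    labels = [0, 1, 0, 1]                        # 0 = not spam, 1 = spam

    model = DecisionTreeClassifier().fit(features, labels)
    print(model.predict([[6, 0]]))               # the model's own prediction

Both approaches map an input to an output; the difference is simply who writes the rules – a person, or the training process.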

  • Neural Network is a web of connections, guided by weights and biases
  • Send Time Optimization helps predict the best time to send a communication for the highest response rate, specific to each person.
  • Einstein Engagement Frequency predicts the right number of communications to send without going overboard.
  • Natural Language Understanding (NLU) refers to systems that handle communication between people and machines. Processing data from unstructured to structured is NLU, and the reverse is Natural Language Generation (NLG).
  • NLP is distinct from NLU and describes a machine’s ability to understand what humans mean when they speak as they naturally would to another human.
  • Named Entity Recognition (NER) labels sequences of words and picks out the important things like names, dates and times. NER involves breaking apart a sentence into segments that a computer can understand and respond to quickly.
  • Deep Learning refers to neural networks being developed between data points in large databases
  • Einstein Bots automatically resolve top customer issues, collect qualified customer information, and seamlessly hand off the customers to agents, meaning increased case deflection in the contact center and reduced handle times for agents.
  • Einstein Agent drives agent productivity across the contact center. Through intelligent case routing, automatic triaging, and case field prediction, Einstein Agent significantly accelerates issue resolution and enhances efficiency.
  • Einstein Discovery helps managers take action with predictive service KPIs. By serving up real-time analysis of the drivers that impact KPIs, like churn or CSAT, along with suggested recommendations and explanations, managers are empowered to make more strategic decisions for their business.
  • Einstein Vision for Field Service automates image classification to resolve issues faster on-site. Just by taking a picture of the object, Einstein Vision can instantly identify the part, ensuring accuracy for the technician and boosting first-time fix rates.
  • Einstein Language brings the power of deep learning to developers. They can use pretrained models to classify text by sentiment as either positive, neutral, or negative, and then classify the underlying intent in a body of text. Put it all together, and you have the ability to process language across unstructured data in any app (see the sentiment and entity sketch after this list).
  • A Large Language Model (LLM) is a deep learning model that can perform a variety of natural language processing tasks. LLMs use transformer models and are trained on massive datasets.
  • A Transformer is an architecture capable of identifying important relationships between words, no matter how far apart they appear within a block of text, and it can retain those connections even after processing many words.
  • Parallel Computing is where one computer processor does the first calculation while a different processor does the second at the same time. This reduces the time it takes to train a transformer (see the parallel-computing sketch after this list).
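Einstein Language and NER are easier to picture with a concrete example. The sketch below is an illustrative stand-in only – it uses the open-source Hugging Face transformers library rather than any Einstein API, and the example sentence is made up – but it shows the same two ideas: a pretrained model scoring sentiment, and a pretrained model picking out named entities such as people and organizations.

    from transformers import pipeline

    # Pretrained sentiment classifier and named entity recognizer.
    sentiment = pipeline("sentiment-analysis")
    ner = pipeline("ner", aggregation_strategy="simple")

    text = "Maria booked a demo with Acme Corp for Friday at 3 pm."

    print(sentiment(text))  # e.g. [{'label': 'POSITIVE', 'score': ...}]
    print(ner(text))        # picks out entities such as 'Maria' and 'Acme Corp'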
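To illustrate parallel computing, here is another small sketch (again my own, using only the Python standard library): two independent calculations run at the same time on separate processor cores. Training a transformer parallelizes far larger matrix calculations across GPUs, but the principle is the same.

    from concurrent.futures import ProcessPoolExecutor

    def heavy_calculation(n):
        # An independent piece of work that one processor can do on its own.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            first = pool.submit(heavy_calculation, 10_000_000)   # one core
            second = pool.submit(heavy_calculation, 20_000_000)  # another core, at the same time
            print(first.result(), second.result())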





