One Article Review

The article:
Source ProofPoint
Identifier 8484113
Publication date 2024-04-17 18:00:31 (viewed: 2024-04-17 13:07:19)
Title Reducing Prompting Churn with Exploding Template Composition
Text Engineering Insights is an ongoing blog series that gives a behind-the-scenes look into the technical challenges, lessons and advances that help our customers protect people and defend data every day. Each post is a firsthand account by one of our engineers about the process that led up to a Proofpoint innovation.

In the nascent world of large language models (LLMs), prompt engineering has emerged as a critical discipline. However, as LLM applications expand, it is becoming a more complex challenge to manage and maintain a library of related prompts.

At Proofpoint, we developed Exploding Prompts to manage the complexity through exploding template composition. We first created the prompts to generate soft labels for our data across a multitude of models and labeling concerns. But Exploding Prompts has also enabled use cases for LLMs that were previously locked away because managing the prompt lifecycle is so complex.

Recently, we've seen exciting progress in the field of automated prompt generation and black-box prompt optimization through DSPy. Black-box optimization requires hand-labeled data to generate prompts automatically, a luxury that's not always an option. You can use Exploding Prompts to generate labels for unlabeled data, as well as for any prompt-tuning application without a clear (or tractable) objective for optimization.

In the future, Exploding Prompts could be used with DSPy to achieve a human-in-the-loop feedback cycle. We are also thrilled to announce that Exploding Prompts is now an open-source release. We encourage you to explore the code and consider how you might help make it even better.

The challenge: managing complexity in prompt engineering

Prompt engineering is not just about crafting queries that guide intelligent systems to generate the desired outputs; it's about doing it at scale. As developers push the boundaries of what is possible with LLMs, the need to manage a vast array of prompts efficiently becomes more pressing. Traditional methods often need manual adjustments and updates across numerous templates, a process that is both time-consuming and error-prone.

To understand this problem, consider the following scenario. You need to label a large quantity of data. Multiple labels can apply to each piece of data, and each label requires its own prompt template. You timebox your work and find a prompt template that achieves desirable results for your first label. Happily, most of the template is reusable. So, for the next label, you copy-paste the template and change the portion of the prompt that is specific to the label itself. You continue doing this until you figure out that the section of the template that has persisted through each version of your labels can be improved. Now you face the task of iterating through potentially dozens of templates to make a minor update to each of the files.

Once you finish, your artificial intelligence (AI) provider releases a new model that outperforms your current model. But there's a catch. The new model requires another small update to each of your templates. To your chagrin, the task of managing the lifecycle of your templates soon takes up most of your time.

The solution: exploding prompts from automated dependency graphs

Prompt templating is a popular way to manage complexity. Exploding Prompts builds on prompt templating by introducing an “explode” operation. This allows a few single-purpose templates to explode into a multitude of prompts.
This is accomplished by building dependency graphs automatically from the directory structure and the content of prompt template files. At its core, Exploding Prompts embodies the “write it once” philosophy. It ensures that every change made in a template correlates with a single update in one file. This enhances efficiency and consistency, as updates automatically propagate across all relevant generated prompts. This separation ensures that updates can be made with speed and efficiency so you can focus on innovation rather than maintenance.
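The post does not include code, but the explode operation can be sketched in a few lines. The snippet below is a minimal illustration of the concept, not the actual Exploding Prompts API: per-label and per-model fragments are read from two directories, and a single shared base template is rendered once per (label, model) combination. The directory layout, file names and placeholder names are assumptions made for this illustration.

```
from itertools import product
from pathlib import Path
from string import Template

# One shared base template; $model_family, $label_instructions and $document
# are hypothetical placeholder names used only for this sketch.
BASE = Template(
    "You are a labeling assistant targeting $model_family.\n"
    "$label_instructions\n"
    "Text to label:\n$document"
)

def load_fragments(directory: str) -> dict:
    """Read every *.txt file in a directory and map its stem to its contents."""
    return {path.stem: path.read_text() for path in Path(directory).glob("*.txt")}

def explode(labels_dir: str, models_dir: str) -> dict:
    """Render one prompt per (label, model) combination from the shared base."""
    labels = load_fragments(labels_dir)   # e.g. spam.txt, phishing.txt, ...
    models = load_fragments(models_dir)   # e.g. mistral.txt, gpt4.txt, ...
    exploded = {}
    for (label, label_text), (model, model_text) in product(labels.items(), models.items()):
        # safe_substitute leaves $document untouched so it can be filled per record later
        exploded[(label, model)] = BASE.safe_substitute(
            model_family=model_text, label_instructions=label_text
        )
    return exploded
```

Under a scheme like this, a wording change to the shared base text or to any single fragment file is made exactly once and propagates to every generated prompt, which is the “write it once” property described above.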
Sent Yes
Tags Malware Tool Threat Studies Cloud Technical
Stories
Notes ★★★


The article does not appear to have been picked up after its publication.


The article does not appear to have been taken from an earlier one.