I Replaced GPT-4 with a Local SLM and My CI/CD Pipeline Stopped Failing

Rewriting the same system prompt, again:
You MUST return ONLY valid JSON. No preamble. No code fences. No explanation. JUST the JSON object.
I had written MUST in all caps. To a language model. As if emphasis would work on something that has neither feelings nor, apparently, a consistent definition of "valid JSON."
It didn't work. Here's what did.
The Nightly Batch Job That Kept GPT-4 Employed
Our team ingests research documents: PDFs, plain text, and occasionally those sad, half-structured reports a vendor clearly exported from a spreadsheet they were proud of. Part of that pipeline parses them and extracts structured fields before anything touches the data warehouse: methodology type, dataset source, key metrics.
This sounds like a solved problem. It usually is, right up until there are roughly forty methodology variants on the list and the documents stop looking like the ones you trained on.
For a while we handled this with regex, rule-based extractors, and a fine-tuned BERT model. To be fair, it worked, but maintaining it felt like maintaining a CSS file from 2015: you touch one rule and something unrelated breaks on a page you haven't visited in months.
So when GPT-4 arrived, we tried it.
I won't lie, it was impressive. Edge cases that had driven BERT mad for months, formats we had never seen before, documents with inconsistent sections: GPT-4 handled them all cleanly.
The team demo went well. I mean, someone actually said "wow" out loud.
I messaged my manager that night: "I think we've solved the extraction problem." Sent with confidence.
Two weeks after we shipped it, the failures started.
The "Mostly Consistent" Problem
In my experience, GPT-4 is capable and non-deterministic.
For most use cases, the non-determinism doesn't matter. For a nightly batch pipeline feeding a data warehouse, it matters enormously.
With temperature=0 you get mostly consistent outputs. In a CI/CD context, "mostly" means "will break on a Friday."
The failures weren't dramatic; if they had been, they would have been easier to debug. GPT-4 wasn't hallucinating fields or returning garbage.
It did subtle things. "dataset_source" one night, "datasetSource" the next, "source_dataset" the night after that. Markdown code fences around the JSON even though we told it not to, in every revision of the prompt. Numbers returned as strings. JSON null returned as the Python string "None"; I spent longer than I'd like to admit staring at that one.
Each failure followed the same ritual.
Pydantic catches it downstream, the pipeline fails, I confirm it's yet another formatting quirk, then scramble to re-run things by hand. It passes, and three days later something subtly different happens.
So I started keeping a log. Six weeks of it:
- 23 pipeline failures from GPT-4 output inconsistencies
- ~18 minutes average to diagnose and re-run
- 0 actual bugs in the pipeline code
Zero actual bugs. Every single failure was the model being subtly different from what it had been the day before. That's the number that made me stop defending the setup.
What I Tried Before Admitting the Real Problem
Prompting
Seven prompt rewrites in two days.
I hadn't even noticed it had gotten that bad. I just wanted a clean fix. And it didn't stop there.
I tried all-caps instructions, few-shot examples, and counter-examples under a "do NOT do this" heading. I even tried adding a reminder at the end of the user message as a final nudge, as if the model would reach that line and think: oh right, JSON only, almost forgot.
At one point the same instructions appeared in three different places at once, and I honestly believed that might help. The failures continued, unchanged.
The cleanup parser
When prompting failed, I wrote code to clean up whatever mess GPT-4 handed back to me.
Strip the markdown fences, find the JSON object if it was buried in prose, and fall back to the raw output if nothing else worked.
That approach actually worked for about a week, which was just long enough for me to feel good about it.
Then GPT-4 started returning structurally valid JSON with the wrong key names, camelCase instead of snake_case. The parser passed it through happily, and the error surfaced three steps later.
I was playing whack-a-mole against a model with an infinite supply of moles.
response_format + temperature=0. OpenAI's response_format={"type": "json_object"} combined with temperature=0 brought failures down from 23 to about 9. Meaningful progress. Still not zero, though, and "nine random failures every six weeks" is not a pipeline property I can defend.
Function calling
This is the one that almost worked. Forcing the output through a typed schema contract genuinely tightened things up.
I stopped checking the logs every morning, stopped watching for the Slack alert. Told someone on the team it was stable. Which, if you work in software, you know is the fastest way to jinx something.
functions = [
    {
        "name": "extract_document_metadata",
        "parameters": {
            "type": "object",
            "properties": {
                "methodology_type": {
                    "type": "string",
                    "enum": ["experimental", "observational", "review", "simulation", "mixed"]
                },
                "dataset_source": {"type": "string"},
                "primary_metric": {"type": "string"},
                "year": {"type": "integer"},
                "confidence_score": {"type": "number", "minimum": 0, "maximum": 1}
            },
            "required": ["methodology_type", "dataset_source", "year"]
        }
    }
]

response = openai_client.chat.completions.create(
    model="gpt-4-turbo",
    messages=messages,
    functions=functions,
    function_call={"name": "extract_document_metadata"},
    temperature=0
)
Then one Tuesday, the OpenAI API had a 20-minute outage. The pipeline failed hard; not a model problem, just a network dependency we could do nothing about.
We couldn't pin the model, couldn't run offline, couldn't answer "what changed between the run that worked and the run that didn't?" because the model on the other end wasn't ours.
Sitting there waiting for someone else's API to recover, I finally asked the question I should have asked months earlier: does this specific task actually need a frontier model?
Local Models Are Better Than I Expected
I went in fully expecting to spend a day confirming they weren't good enough. I had the conclusion written before I ran a single test: I tried it, the quality wasn't there, stick with GPT-4 and smarter retry logic. The story where my original decision was still defensible.
It took about three hours to realize I had been thinking about this wrong.
Extracting a document into a fixed schema is not actually a hard language-model task. Not in the way that makes a frontier model necessary. There's no involved reasoning, no synthesis, no need for the breadth of world knowledge that makes GPT-4 worth the price.
What it actually is: structured reading comprehension. The model reads a document and fills in fields. A well-trained 7B model does this fine. And with the right settings, specifically seeded inference, it does it identically on every run.
I benchmarked four models against 50 documents I had annotated by hand:
- Phi-3-mini (3.8B): Better instruction following than I expected from a 3.8B model, but it fell apart on anything past about 3,000 tokens. I almost picked it before checking the long-document results.
- Mistral 7B Instruct: Solid across the board, no real surprises in either direction. The Toyota Camry of local models. You'd be fine with it.
- Qwen2.5-7B-Instruct: The clear winner. The most consistently structured output of the four, by a margin wide enough that it wasn't a close call.
- Llama 3.2 3B Instruct: Fast, but extraction quality dropped enough on edge cases that I wouldn't run it on production data without a lot more validation work first.
Qwen2.5 and Mistral both hit 90–95% accuracy on the annotated set, lower than GPT-4 on genuinely ambiguous documents, yes.
But I ran Qwen2.5 over 20 documents three times and went back to diff the outputs. Zero variation. The same JSON, the same values, the same field order, every run.
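The diff itself was nothing clever. A minimal sketch of the comparison, with a helper name I've made up for illustration:

```python
import json


def runs_identical(runs: list[list[dict]]) -> bool:
    """True when every run produced equivalent output for every document.

    Each run is a list of per-document extraction dicts. We canonicalize
    with sorted keys so dict ordering can't mask or fake a difference.
    """
    canonical = [
        [json.dumps(doc, sort_keys=True) for doc in run]
        for run in runs
    ]
    return all(run == canonical[0] for run in canonical[1:])
```

Three runs over the same 20 documents in, three identical lists out: that's the property the whole migration was chasing.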
After six weeks of failures I couldn't predict, that felt almost too clean to be real.
Before and After
The stabilized GPT-4 extractor, not the early prototype but the version after months of defensive code had accreted around it:
# extractor_gpt4.py
import os
import json

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

SYSTEM_PROMPT = """You are a research document metadata extractor.
Given the text of a research document, extract the specified metadata fields.
Be precise. If you are unsure about a field, use your best judgment based on context."""

EXTRACTION_SCHEMA = {
    "name": "extract_document_metadata",
    "parameters": {
        "type": "object",
        "properties": {
            "methodology_type": {
                "type": "string",
                "enum": ["experimental", "observational", "review", "simulation", "mixed"]
            },
            "dataset_source": {"type": "string"},
            "primary_metric": {"type": "string"},
            "year": {"type": "integer"},
            "confidence_score": {"type": "number"}
        },
        "required": ["methodology_type", "dataset_source", "year"]
    }
}

def extract_metadata_gpt4(document_text: str) -> dict:
    response = client.chat.completions.create(
        model="gpt-4-turbo",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Extract metadata from this document:\n\n{document_text[:8000]}"}
        ],
        functions=[EXTRACTION_SCHEMA],
        function_call={"name": "extract_document_metadata"},
        temperature=0,
        timeout=30
    )
    try:
        args = response.choices[0].message.function_call.arguments
        return json.loads(args)
    except (AttributeError, json.JSONDecodeError) as e:
        raise ValueError(f"Failed to parse GPT-4 response: {e}")
Latency: 3.5–5.8s per document. Cost: ~$0.04 per call. Failure rate: ~6% after all the fixes.
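For scale, a back-of-envelope using those numbers and a typical nightly batch of 312 documents (the batch size from our logs; the monthly figure assumes the pipeline runs every night):

```python
docs_per_night = 312   # typical nightly batch from our logs
cost_per_call = 0.04   # ~$0.04 per GPT-4 call, as measured above

nightly = docs_per_night * cost_per_call
monthly = nightly * 30  # assuming the job runs every night

print(f"${nightly:.2f}/night, ~${monthly:.0f}/month")  # $12.48/night, ~$374/month
```

Not a fortune, but it's recurring spend on a task that, as it turned out, a free local model handles deterministically.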
I picked Ollama instead. A coworker wouldn't stop talking about it, and the docs made sense at 11 PM, which honestly is how most infrastructure decisions get made. Its REST API is close enough to OpenAI's that the swap took about an hour:
# extractor_slm.py
import json
import logging
import os
import re

import requests

logger = logging.getLogger(__name__)

OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")
MODEL = os.getenv("MODEL_NAME", "qwen2.5:7b-instruct-q4_K_M")

# Tested empirically: 7B q4 on a T4 takes ~8s on cold first token,
# then ~1.5s per doc chunk. 45s covers bad days.
_TIMEOUT = 45

_SYSTEM_PROMPT = """
You are a metadata extractor for research documents.
Return ONLY a JSON object — no explanation, no markdown, no surrounding text.
Fields to extract:
- methodology_type (required): one of experimental | observational | review | simulation | mixed
- dataset_source (required): where the data came from
- year (required): integer
- primary_metric: main eval metric if present
- confidence_score: your confidence 0.0–1.0
Output example:
{"methodology_type": "experimental", "dataset_source": "ImageNet", "year": 2022, "primary_metric": "top-1 accuracy", "confidence_score": 0.95}
"""

def call_ollama(doc_text: str) -> dict:
    payload = {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": _SYSTEM_PROMPT},
            {"role": "user", "content": doc_text[:6000]},
        ],
        "stream": False,
        "options": {
            "temperature": 0,
            "seed": 42,  # determinism — this is the whole point
        },
    }
    try:
        resp = requests.post(f"{OLLAMA_URL}/api/chat", json=payload, timeout=_TIMEOUT)
        resp.raise_for_status()
    except requests.exceptions.Timeout:
        raise RuntimeError(f"Ollama timed out after {_TIMEOUT}s — is the model loaded?")
    except requests.exceptions.ConnectionError:
        raise RuntimeError(f"Can't reach Ollama at {OLLAMA_URL} — is the container running?")
    raw = resp.json()["message"]["content"].strip()
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw).strip()
    try:
        return json.loads(cleaned)
    except json.JSONDecodeError as e:
        logger.error("Failed to parse model output: %s\nRaw was: %s", e, raw[:400])
        raise
The seed: 42 in options is what actually buys the determinism. Ollama supports seeded generation; same input, same seed, same output, every single time, not approximately. temperature=0 on a hosted API promises this but never guaranteed it, because you don't control the runtime. Locally, you do.
Wiring It Into GitHub Actions
Two things are non-obvious until they bite you. First: GitHub Actions service containers are network-isolated from the runner. You can't docker exec into them; pulling the model has to go through the REST API. Second: cache the model. A cold pull is 4.7GB and adds 3–4 minutes to every job.
# The non-obvious parts — the rest is standard Actions boilerplate
- name: Cache Ollama model
  uses: actions/cache@v4
  with:
    path: ~/.ollama/models
    key: ollama-qwen2.5-7b-q4

- name: Pull SLM model
  # Can't docker exec into service containers — use the API
  run: |
    curl -s http://localhost:11434/api/pull \
      -d '{"name": "qwen2.5:7b-instruct-q4_K_M"}' \
      --max-time 300

- name: Warm up model
  run: |
    curl -s http://localhost:11434/api/generate \
      -d '{"model": "qwen2.5:7b-instruct-q4_K_M", "prompt": "hello", "stream": false}' \
      > /dev/null

- name: Run ingestion pipeline
  run: python pipeline/run_ingestion.py
  env:
    OLLAMA_URL: "http://localhost:11434"
    MODEL_NAME: "qwen2.5:7b-instruct-q4_K_M"
The Capitalization Bug I Chased for Too Long
Qwen2.5 occasionally returns "Experimental" with a capital E despite explicit instructions not to. The Literal type check rejects it, and you get a validation failure on a document the model extracted perfectly well.
I spent an embarrassingly long time on this because the error message just says "invalid value," and my first instinct was to check the document, then the extractor, before I finally looked at the validator and saw it.
A normalize_methodology validator with mode="before" fixes it before the type check ever runs:
# pipeline/validation.py
from typing import Literal, Optional

from pydantic import BaseModel, Field, field_validator

class DocMetadata(BaseModel):
    methodology_type: Literal["experimental", "observational", "review", "simulation", "mixed"]
    dataset_source: str = Field(min_length=3)
    year: int = Field(ge=1950, le=2030)
    primary_metric: Optional[str] = None
    confidence_score: Optional[float] = Field(default=None, ge=0.0, le=1.0)

    @field_validator("dataset_source")
    @classmethod
    def strip_whitespace(cls, v: str) -> str:
        return v.strip()

    @field_validator("methodology_type", mode="before")
    @classmethod
    def normalize_methodology(cls, v: str) -> str:
        # Qwen occasionally returns "Experimental" despite instructions
        return v.lower().strip() if isinstance(v, str) else v
I want to be direct about the trade-offs, because I hate articles that aren't.
GPT-4 genuinely is better on ambiguous documents, non-English text, anything that needs actual judgment. There's a class of document in our corpus, old PDFs, mixed-language reports, unusual formats, where the SLM stumbles and GPT-4 doesn't.
We flag those separately and route them to a review queue. It's not seamless, but it's honest. The SLM does the 90% it's good at and isn't asked to do more.
The setup is also more work than pip install openai. Initial configuration takes a real afternoon. And Ollama doesn't auto-update models, so version management is my problem now. I've genuinely made peace with that; knowing exactly which model version ran last night and the night before is the entire point.
What I Think Now
The morning the pipeline first ran clean, no alerts, no retries, just a log file showing 312 documents processed in 8.4 minutes, I checked it twice, then ran it again by hand anyway.
I had spent six weeks expecting an alert before I finished my coffee. Watching it pass quietly felt genuinely strange.
The mistake wasn't using GPT-4. The mistake was treating a probabilistic system as a deterministic function. temperature=0 reduces the variance. It doesn't eliminate it.
I understood that in theory the whole time. It took 23 failures to understand it in a way that changed how I build things.
If you're calling an LLM the default way and things keep breaking in ways you can't reproduce, it might not be your code or your prompts. It might be the nature of the thing you're depending on.
A local model on hardware you control, seeded for determinism, is a genuinely different tool for this kind of work. Not better at everything, but better at being the same thing twice. And for a pipeline that runs unattended every night, that's most of what matters.
Before you go!
If this was useful, I write more about the messy reality of building with AI: what breaks in production, what actually works, and the trade-offs behind the fixes.
You can subscribe to my newsletter if you'd like more of that.



