The following is a simple example of how to generate a proof for a problem from the miniF2F dataset:
````python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
import time

torch.manual_seed(30)

model_id = "deepseek-ai/DeepSeek-Prover-V2-7B"  # or deepseek-ai/DeepSeek-Prover-V2-671B
tokenizer = AutoTokenizer.from_pretrained(model_id)

# The miniF2F problem to prove, stated in Lean 4 with a `sorry` placeholder.
formal_statement = """
import Mathlib
import Aesop

set_option maxHeartbeats 0

open BigOperators Real Nat Topology Rat

/-- What is the positive difference between $120\%$ of 30 and $130\%$ of 20? Show that it is 10.-/
theorem mathd_algebra_10 : abs ((120 : ℝ) / 100 * 30 - 130 / 100 * 20) = 10 := by
  sorry
""".strip()

# Prompt template: ask the model for a proof plan first, then the completed Lean 4 code.
prompt = """
Complete the following Lean 4 code:

```lean4
{}
```

Before producing the Lean 4 code to formally prove the given theorem, provide a detailed proof plan outlining the main proof steps and strategies.
The plan should highlight key ideas, intermediate lemmas, and proof structures that will guide the construction of the final formal proof.
""".strip()

chat = [
    {"role": "user", "content": prompt.format(formal_statement)},
]

model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype=torch.bfloat16, trust_remote_code=True
)
inputs = tokenizer.apply_chat_template(
    chat, tokenize=True, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

start = time.time()
outputs = model.generate(inputs, max_new_tokens=8192)
print(tokenizer.batch_decode(outputs))
print(time.time() - start)
````

Reference: https://github.com/deepseek-ai/DeepSeek-Prover-V2/tree/main
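The generated output contains both the natural-language proof plan and the completed Lean 4 code. As a minimal post-processing sketch that continues the script above (the `extract_lean_proof` helper, the output file name, and the assumption that the completion wraps the proof in a lean4 code fence are illustrative, not part of the official repository), one way to pull out just the Lean proof and save it for later checking is:

````python
import re

def extract_lean_proof(generated_text):
    """Return the first fenced ```lean4 ... ``` block in the model output, if any.

    Assumption: because the prompt asks the model to 'Complete the following
    Lean 4 code', the completion normally contains such a fenced block.
    """
    match = re.search(r"```lean4\n(.*?)```", generated_text, re.DOTALL)
    return match.group(1).strip() if match else None

# Decode only the newly generated tokens (skip the prompt portion).
generated = tokenizer.batch_decode(
    outputs[:, inputs.shape[1]:], skip_special_tokens=True
)[0]

proof = extract_lean_proof(generated)
if proof is not None:
    # Save the candidate proof so it can later be checked with a Lean 4 toolchain.
    with open("mathd_algebra_10_proof.lean", "w", encoding="utf-8") as f:
        f.write(proof + "\n")
else:
    print("No lean4 code block found in the model output.")
````

Note that whether the candidate proof actually closes the goal still has to be verified by compiling it with Lean 4 and Mathlib; the model output by itself is not a guarantee of correctness.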