
TypeError: llama_tokenize() missing 2 required positional arguments: 'add_bos' and 'special'

百变鹏仔 · 1 month ago (01-18) · #Python
Tags: error
Question

I am running Python 3.11 with the latest version of llama-cpp-python and a GGUF model.

I expected the code to run normally like a chatbot, but I get this error:

```
Traceback (most recent call last):
  File "d:i customi arushserver.py", line 223, in <module>
    init()
  File "d:i customi arushserver.py", line 57, in init
    m_eval(model, m_tokenize(model, prompt_init, True), False, "starting up...")
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "d:i customi arushserver.py", line 182, in m_tokenize
    n_tokens = llama_cpp.llama_tokenize(
               ^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: llama_tokenize() missing 2 required positional arguments: 'add_bos' and 'special'
```

Here is my tokenization code:

```python
def m_tokenize(model: llama_cpp.Llama, text: bytes, add_bos=False, special=False):
    assert model.ctx is not None
    n_ctx = llama_cpp.llama_n_ctx(model.ctx)
    tokens = (llama_cpp.llama_token * int(n_ctx))()
    n_tokens = llama_cpp.llama_tokenize(
        model.ctx,
        text,
        tokens,
        n_ctx,
        llama_cpp.c_bool(add_bos),
    )
    if int(n_tokens) < 0:
        raise RuntimeError(f'Failed to tokenize: text="{text}" n_tokens={n_tokens}')
    return list(tokens[:n_tokens])
```

Correct answer

```
TypeError: llama_tokenize() missing 2 required positional arguments: 'add_bos' and 'special'
```
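This TypeError is raised by Python's own argument checking, before any C code in llama.cpp ever runs: the installed binding's `llama_tokenize` wrapper gained two required parameters, so a call written for the older signature fails immediately. A minimal stub (not the real binding) reproduces the mechanism:

```python
# Stand-in with the same arity as the new llama_tokenize wrapper; the body
# is irrelevant because the call below fails during argument binding.
def llama_tokenize(model, text, text_len, tokens, n_max_tokens, add_bos, special):
    return 0

try:
    llama_tokenize("ctx", b"hi", [], 8, True)  # old-style 5-argument call
except TypeError as e:
    print(e)  # llama_tokenize() missing 2 required positional arguments: 'add_bos' and 'special'
```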

To resolve the error, you need to pass the arguments add_bos and special to the llama_tokenize() call.

```python
def m_tokenize(model: llama_cpp.Llama, text: bytes, add_bos=False, special=False):
    assert model.ctx is not None
    n_ctx = llama_cpp.llama_n_ctx(model.ctx)
    tokens = (llama_cpp.llama_token * int(n_ctx))()
    # Include the previously missing arguments in the function call.
    # Plain Python bools are sufficient here: ctypes converts them to c_bool
    # automatically, so wrapping them in llama_cpp.c_bool() is unnecessary.
    # NOTE: per the signature quoted below, versions matching that snippet also
    # expect a text_len argument (e.g. len(text)) after text; adjust the call
    # to match your installed version.
    n_tokens = llama_cpp.llama_tokenize(
        model.ctx,
        text,
        tokens,
        n_ctx,
        add_bos,   # instead of llama_cpp.c_bool(add_bos)
        special,   # instead of llama_cpp.c_bool(special)
    )
    if int(n_tokens) < 0:
        raise RuntimeError(f'Failed to tokenize: text="{text}" n_tokens={n_tokens}')
    return list(tokens[:n_tokens])
```

From [llama_cpp.py (GitHub)](https://www.php.cn/link/fcbc95ccdd551da181207c0c1400c655), code starting at line 1817:

```python
def llama_tokenize(
    model: llama_model_p,
    text: bytes,
    text_len: Union[c_int, int],
    tokens,  # type: Array[llama_token]
    n_max_tokens: Union[c_int, int],
    add_bos: Union[c_bool, bool],
    special: Union[c_bool, bool],
) -> int:
    """Convert the provided text into tokens."""
    return _lib.llama_tokenize(
        model, text, text_len, tokens, n_max_tokens, add_bos, special
    )


_lib.llama_tokenize.argtypes = [
    llama_model_p,
    c_char_p,
    c_int32,
    llama_token_p,
    c_int32,
    c_bool,
    c_bool,
]
_lib.llama_tokenize.restype = c_int32
```
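The comments above claim that a plain Python bool can be passed wherever the binding's `argtypes` declare `c_bool`. A stdlib-only ctypes sketch illustrating that automatic conversion (the callback here is a hypothetical stand-in, not part of llama-cpp-python):

```python
import ctypes

# A prototype whose single parameter is declared as c_bool, mirroring how
# _lib.llama_tokenize declares its add_bos and special arguments.
proto = ctypes.CFUNCTYPE(ctypes.c_int, ctypes.c_bool)

def _impl(flag):  # hypothetical stand-in for a C function taking a bool
    return 1 if flag else 0

f = proto(_impl)
# ctypes converts a plain Python bool to c_bool automatically, so wrapping
# arguments in ctypes.c_bool() is optional.
assert f(True) == 1                  # plain bool is accepted
assert f(ctypes.c_bool(False)) == 0  # an explicit c_bool works too
```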