Description of the bug:

ai_edge_torch.__version__: '0.3.0.dev20250105'
torch.__version__: '2.5.1+cu124'
Python version: 3.11.9
Llama model: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct
I ran the command:

```
ai-edge-torch/ai_edge_torch/generative/examples/llama$ python convert_to_tflite.py
```

The only change I made was to the default checkpoint path:

```diff
 _CHECKPOINT_PATH = flags.DEFINE_string(
     'checkpoint_path',
-    os.path.join(pathlib.Path.home(), 'Downloads/llm_data/llama'),
+    os.path.join(pathlib.Path.home(), 'Downloads/llama'),
     'The path to the model checkpoint, or directory holding the checkpoint.',
 )
```
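(Side note, my addition rather than part of the original report: since `checkpoint_path` is defined with absl's `flags.DEFINE_string`, the same path can be supplied on the command line instead of editing the source, e.g.:)

```sh
# Override the checkpoint path via the absl flag; the path shown is the
# example location from this report.
python convert_to_tflite.py --checkpoint_path="$HOME/Downloads/llama"
```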
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/export/exported_program.py", line 114, in wrapper
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/export/_trace.py", line 1880, in _export
export_artifact = export_func( # type: ignore[operator]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/export/_trace.py", line 1224, in _strict_export
return _strict_export_lower_to_aten_ir(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/export/_trace.py", line 1333, in _strict_export_lower_to_aten_ir
aten_export_artifact = lower_to_aten_callback(
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/export/_trace.py", line 637, in _export_to_aten_ir
gm, graph_signature = transform(aot_export_module)(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 1246, in aot_export_module
fx_g, metadata, in_spec, out_spec = _aot_export_function(
^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 1480, in _aot_export_function
fx_g, meta = create_aot_dispatcher_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 522, in create_aot_dispatcher_function
return _create_aot_dispatcher_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 623, in _create_aot_dispatcher_function
fw_metadata = run_functionalized_fw_and_collect_metadata(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/collect_metadata_analysis.py", line 173, in inner
flat_f_outs = f(*flat_f_args)
^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/utils.py", line 182, in flat_fn
tree_out = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/traced_function_transforms.py", line 859, in functional_call
out = PropagateUnbackedSymInts(mod).run(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/fx/interpreter.py", line 146, in run
self.env[node] = self.run_node(node)
^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/fx/experimental/symbolic_shapes.py", line 5498, in run_node
result = super().run_node(n)
^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/fx/interpreter.py", line 203, in run_node
return getattr(self, n.op)(n.target, args, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/fx/interpreter.py", line 275, in call_function
return target(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Cannot set version_counter for inference tensor
While executing %getitem_8 : [num_users=1] = call_function[target=operator.getitem](args = (%int_2, 0), kwargs = {})
Original traceback:
File "/home/haseok/miniconda3/lib/python3.11/site-packages/ai_edge_torch/generative/utilities/converter.py", line 39, in forward
return self.module(*export_args, **full_kwargs)
File "/home/haseok/miniconda3/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
File "/home/haseok/miniconda3/lib/python3.11/site-packages/ai_edge_torch/generative/utilities/model_builder.py", line 121, in forward
return self._forward_with_embeds(
File "/home/haseok/miniconda3/lib/python3.11/site-packages/ai_edge_torch/generative/utilities/model_builder.py", line 147, in _forward_with_embeds
x, kv_entry = block(x, rope, mask, input_pos, kv_entry)
File "/home/haseok/miniconda3/lib/python3.11/site-packages/ai_edge_torch/generative/layers/attention.py", line 95, in forward
atten_func_out = self.atten_func(x_norm, rope, mask, input_pos, kv_cache)
File "/home/haseok/miniconda3/lib/python3.11/site-packages/ai_edge_torch/generative/layers/attention.py", line 218, in forward
kv_cache = kv_utils.update(kv_cache, input_pos, k, v)
File "/home/haseok/miniconda3/lib/python3.11/site-packages/ai_edge_torch/generative/layers/kv_cache.py", line 164, in update
return update_kv_cache(cache, input_pos, k_slice, v_slice)
File "/home/haseok/miniconda3/lib/python3.11/site-packages/ai_edge_torch/generative/layers/kv_cache.py", line 197, in _update_kv_impl
k_slice_indices = _get_slice_indices(input_pos)
File "/home/haseok/miniconda3/lib/python3.11/site-packages/ai_edge_torch/generative/layers/kv_cache.py", line 184, in _get_slice_indices
positions = positions.int()[0].reshape([])
```
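For what it's worth (my own note, not from the report): the message points at an inference tensor, i.e. a tensor created under `torch.inference_mode()`. Such tensors carry no autograd version counter, so a pass that tries to set one fails; the `decorate_context` frame in the traceback suggests the converter runs the forward under an inference-mode/no_grad decorator. Below is a minimal sketch of the underlying restriction, assuming only stock PyTorch (the exact error text it prints differs from the one above):

```python
import torch

with torch.inference_mode():
    cache = torch.zeros(4)    # an "inference tensor": no version counter
print(cache.is_inference())   # True

try:
    cache.add_(1)             # in-place update outside inference mode
except RuntimeError as e:
    print(e)                  # inference-tensor mutation is rejected

normal = cache.clone()        # cloning outside inference mode yields a
normal.add_(1)                # regular tensor that can be mutated freely
print(normal.is_inference())  # False
```

If something like this is happening during export, the usual workaround is to create or clone the offending tensors outside inference mode.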
Has anyone else run into this problem?
Actual vs expected behavior:
No response
Any other information you'd like to share?
Additionally, verify.py fails for me as well. Is that failure related to this error? If not, can you tell me why verify.py fails? I didn't change verify.py at all.