Conversation

joecummings

No description provided.

encoder_output = model_kwargs["encoder_outputs"][encoder_output_key]
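For context, a minimal sketch of what this lookup does. The dict layout and the `last_hidden_state` key below are illustrative assumptions (Hugging Face-style naming), not taken from this PR:

```python
# Illustrative only: different encoder-decoder models store the encoder's
# output under different keys, so the key is looked up dynamically.
model_kwargs = {"encoder_outputs": {"last_hidden_state": [[0.1, 0.2, 0.3]]}}
encoder_output_key = "last_hidden_state"  # assumed key name for this sketch

encoder_output = model_kwargs["encoder_outputs"][encoder_output_key]
print(encoder_output)  # [[0.1, 0.2, 0.3]]
```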

def update_func(emissions, N, T, prev_step_token_idxs, prev_step_model_states, timestep):
    # `emissions` and `N` are unused in the current implementation
joecummings (Contributor Author):

Could change this around to take in `emissions`, but AFAIK there is no easy way to get the actual tensor back from the `data_ptr`.


    return final_tokens_as_tensors
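A hedged sketch of the shape of the callback's return value. The helper name and the list-of-lists input are assumptions for illustration, not the PR's actual code:

```python
import torch

def final_tokens_as_tensor_list(token_idx_lists):
    # Hypothetical helper: convert each hypothesis's Python list of token
    # indices (as accumulated during beam search) into a 1-D LongTensor
    # so downstream code can work with tensors instead of raw index lists.
    return [torch.tensor(idxs, dtype=torch.long) for idxs in token_idx_lists]

hyps = [[2, 15, 7, 3], [2, 15, 9, 3]]  # two finished hypotheses (assumed token ids)
tensors = final_tokens_as_tensor_list(hyps)
print(tensors[0].tolist())  # [2, 15, 7, 3]
```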

if num_python_workers > 1:
joecummings (Contributor Author):

PyTorch has a multiprocessing module (essentially a clone of Python's `multiprocessing`); however, it relies on pickle, which means all of these functions would have to be defined at the global (module) level. Very open to suggestions here.
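To illustrate the pickle constraint mentioned above: pickle serializes functions by their qualified name, so module-level functions round-trip fine, while functions defined inside another function (as these currently are) cannot be pickled at all. A minimal, self-contained demonstration:

```python
import pickle

def module_level_worker(x):
    # Defined at module scope, so pickle can serialize it by qualified name.
    return x * 2

def make_nested_worker():
    # Functions defined inside another function have no importable name,
    # which is why multiprocessing targets must live at module level.
    def nested_worker(x):
        return x * 2
    return nested_worker

# A module-level function round-trips through pickle.
restored = pickle.loads(pickle.dumps(module_level_worker))
print(restored(3))  # 6

# A nested function does not.
try:
    pickle.dumps(make_nested_worker())
except (pickle.PicklingError, AttributeError) as exc:
    print(f"pickling failed: {type(exc).__name__}")
```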

@joecummings force-pushed the beam-search branch 2 times, most recently from e2b999c to 9f2d2d6 on February 10, 2023 at 18:21.