
Add py bindings for encrypted models and sample #1751

Open

olpipi wants to merge 1 commit into master
Conversation

olpipi (Collaborator) commented Feb 17, 2025

No description provided.

@github-actions bot added labels on Feb 17, 2025: category: LLM (LLM pipeline, stateful/static), category: Python API (Python API for GenAI), category: samples (GenAI samples)
@github-actions bot added labels on Feb 18, 2025: category: GHA (CI based on Github actions), category: tokenizers (Tokenizer class or submodule update), category: GenAI C++ API (Changes in GenAI C++ public headers)
@olpipi olpipi changed the title poc for encrypted_model_causal_lm.py Add py bindings for encrypted models and sample Feb 18, 2025
@olpipi olpipi marked this pull request as ready for review February 18, 2025 14:47
@olpipi olpipi requested review from Wovchena and ilya-lavrenov and removed request for Wovchena February 18, 2025 14:47
parser.add_argument('prompt')
args = parser.parse_args()

model, weights = read_model(args.model_dir, 'openvino_model.xml', 'openvino_model.bin')
Contributor
Suggested change:
- model, weights = read_model(args.model_dir, 'openvino_model.xml', 'openvino_model.bin')
+ model, weights = decrypt_model(args.model_dir, 'openvino_model.xml', 'openvino_model.bin')

as we need to emulate that the model was encrypted.

The same applies to the tokenizer model.
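To make the sample self-contained, the "encryption" can be emulated with a trivial reversible transform. A minimal sketch of what such a `decrypt_model` helper could look like; the XOR codec and the exact signature here are illustrative assumptions, not the PR's actual implementation:

```python
from pathlib import Path

def codec_xor(data: bytes, key: int = 0x5A) -> bytes:
    # Toy symmetric "cipher": XOR with a fixed key is its own inverse,
    # so the same function both encrypts and decrypts. Emulation only --
    # not real cryptography.
    return bytes(b ^ key for b in data)

def decrypt_model(model_dir: str, model_file_name: str, weights_file_name: str):
    # Read the (emulated) encrypted IR and weights, then decrypt in memory
    # so plain-text model data never has to touch the disk.
    model = codec_xor(Path(model_dir, model_file_name).read_bytes()).decode("utf-8")
    weights = codec_xor(Path(model_dir, weights_file_name).read_bytes())
    return model, weights
```

The decrypted model string and weights buffer can then be passed to the pipeline constructor instead of a file path, which is the point of the encrypted-model sample.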

@@ -774,6 +774,7 @@ jobs:
run: |
source ./ov/setupvars.sh
timeout 30s ./build/samples/cpp/text_generation/encrypted_model_causal_lm ./TinyLlama-1.1B-Chat-v1.0/ "Why is the sun yellow?"
timeout 30s python ./samples/python/text_generation/encrypted_model_causal_lm.py ./TinyLlama-1.1B-Chat-v1.0/ "Why is the sun yellow?"
Collaborator

Verify that the outputs are the same. I think it's best to use `tee` so we see the full output, and `diff` the two files after that.

Collaborator

<<< $'Describe the images?' | tee py.txt

Collaborator

Sample tests were moved to pytest in #1661. Please do the same here.

Labels: category: GenAI C++ API (Changes in GenAI C++ public headers), category: GHA (CI based on Github actions), category: LLM (LLM pipeline, stateful/static), category: Python API (Python API for GenAI), category: samples (GenAI samples), category: tokenizers (Tokenizer class or submodule update)

3 participants