
Newlines in prompts, python<3.12 support, parallel patched requests #2

Closed · wants to merge 4 commits

Conversation

@bjj (Contributor) commented Feb 5, 2024

Sorry I didn't split this up, but things don't seem so fancy yet that it's necessary.

Thanks for getting the ball rolling. I was going to do something similar so it was interesting to try it out.

bjj added 4 commits February 4, 2024 19:22
This does not batch across models because my target is a local server
that has to reload between models. If targeting openai or openrouter
it would make sense to add another layer of batching.
Accounts for ollama-style repository:tag names. Somehow the original version works on Windows (you can write and read back the files), but the files are not easy to access with other tools.
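The `:` in an ollama-style `repository:tag` name is the likely culprit when the name doubles as a filename. As a minimal sketch (a hypothetical helper, not this repository's actual code), the name could be sanitized before use:

```python
import re

def safe_filename(model_name: str) -> str:
    """Replace characters that are invalid or awkward in filenames,
    e.g. the ':' in an ollama-style 'repository:tag' model name."""
    # Hypothetical helper for illustration only.
    return re.sub(r'[:/\\<>"|?*]', "_", model_name)

print(safe_filename("mistral:7b-instruct"))  # mistral_7b-instruct
```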
@Contextualist (Owner)

👋Hi @bjj , thanks for your PR!

  1. 59abc91 Explicit newlines are natively supported, e.g.
    [[prompt]]
    name = "multiline-example"
    chat = """
    system: You are an assistant.
    user: Could you answer the following questions?
    a. 1+1=?
    b. 1^1=?
    assistant: I'm not sure.
    user: Which part is unclear?
    """
    I'll make it clearer in the example.
  2. 239610b Sorry that I'm a bit opinionated on this one. I expect users to be able to create a new Python 3.12 virtual environment for installation.
  3. 78e72ea Batching is nice! I will take a closer look soon.
  4. 10fdff0 This is probably a point of confusion. model[_].name is just an internal tag (i.e. it will not be sent to the endpoint), and it should not be confused with model[_].model.

Let me know if you have further questions/comments on 1, 2, 4, otherwise please open a new PR with 3.

@bjj (Contributor, Author) commented Feb 5, 2024

  1. Ah, I misread the code. I see how it works now.
  2. Ok
  3. Feel free to cherry pick or not.
  4. I ended up matching them to model names, which causes them to be used as filenames (which is not immediately obvious to a user).

@Contextualist (Owner)

Closed as I cherry-picked 78e72ea as a4cea86. Thank you again for taking the time to navigate through the code and make improvements. A pair of fresh eyes means a lot to me for figuring out what doesn't make sense. There will be follow-up documentation enhancements based on your feedback on 1 and 4.

Contextualist added a commit that referenced this pull request Feb 6, 2024