document model integrations #22

Open
opened 6 months ago by utopiah · 2 comments
Owner

cf https://fabien.benetou.fr/Content/SelfHostingArtificialIntelligence

example in Python using Flask and UUID:

# from ../LCM/genimg.py
import os
import uuid
from flask import Flask
app = Flask(__name__)

from diffusers import DiffusionPipeline
import torch

pipe = DiffusionPipeline.from_pretrained("SimianLuo/LCM_Dreamshaper_v7", custom_pipeline="latent_consistency_txt2img", custom_revision="main", revision="main")

# To save GPU memory, torch.float16 can be used, but it may compromise image quality.
pipe.to(torch_device="cuda", torch_dtype=torch.float32)

# Can be set to 1~50 steps. LCM supports fast inference even at <= 4 steps. Recommended: 1~8 steps.
num_inference_steps = 8
# could become a parameter, see the sketch after this block

# make sure the directory Flask serves the generated images from exists
os.makedirs('./static', exist_ok=True)

@app.route('/query/<prompt>', methods=['GET'])
def query(prompt):
    myuuid = uuid.uuid4().hex
    images = pipe(prompt=prompt, num_inference_steps=num_inference_steps, guidance_scale=8.0, lcm_origin_steps=50, output_type="pil").images
    images[0].save('./static/' + myuuid + '.jpg', 'JPEG')
    return {'prompt': prompt, 'url': '/static/' + myuuid + '.jpg'}

if __name__ == '__main__':
    # can then expose as https via e.g. ngrok http 5000
    app.run()
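
The step count is hard-coded above; here is a minimal sketch of how it could become a query parameter instead (the `steps` parameter name, the default of 8 and the clamping to the 1~50 range are assumptions, not part of the original route):

```python
# hypothetical variant of the /query route above (replacing it, not adding a second route):
# the number of inference steps is read from the query string, e.g. GET /query/a%20red%20fox?steps=4
from flask import request

@app.route('/query/<prompt>', methods=['GET'])
def query(prompt):
    # default to 8 steps and clamp to the 1~50 range mentioned in the comment above
    steps = max(1, min(50, request.args.get('steps', default=8, type=int)))
    myuuid = uuid.uuid4().hex
    images = pipe(prompt=prompt, num_inference_steps=steps, guidance_scale=8.0, lcm_origin_steps=50, output_type="pil").images
    images[0].save('./static/' + myuuid + '.jpg', 'JPEG')
    return {'prompt': prompt, 'steps': steps, 'url': '/static/' + myuuid + '.jpg'}
```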

Poster
Owner

note that this is a template, the idea being to expose a service via HTTP

it might be interesting to look at Gradio but overall this is usually for programmatic usage, i.e. endpoint consumption with e.g. JSON, so it might not be needed
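
As an illustration of that kind of programmatic consumption, a minimal client sketch (assuming the Flask service above runs locally on port 5000 and that the `requests` library is installed; both are assumptions, not part of the template):

```python
# query the text-to-image endpoint and download the generated image
import requests
from urllib.parse import quote

base = 'http://localhost:5000'
prompt = 'a lighthouse at night'
data = requests.get(base + '/query/' + quote(prompt)).json()  # e.g. {'prompt': ..., 'url': '/static/<uuid>.jpg'}
img = requests.get(base + data['url'])
with open('result.jpg', 'wb') as f:
    f.write(img.content)
```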

Poster
Owner
see also https://twitter.com/utopiah/status/1720122249938628951