Introducing Gradio Clients
New to Gradio? Start here: Getting Started
See the Release History
To install Gradio from main, run the following command:
pip install https://gradio-builds.s3.amazonaws.com/278645b649fb590e6c9608c568ee0903c735a536/gradio-5.0.0b3-py3-none-any.whl
gradio.load(···)

*Note: Setting share=True in launch() will not work. The loaded demo's custom attributes (its css, js, and head attributes) will not be loaded.*

import gradio as gr
demo = gr.load("gradio/question-answering", src="spaces")
demo.launch()
name: str
the name of the model (e.g. "gpt2" or "facebook/bart-base") or space (e.g. "flax-community/spanish-gpt2"), can include the `src` as prefix (e.g. "models/facebook/bart-base")
src: str | None
= None
the source of the model: `models` or `spaces` (or leave empty if source is provided as a prefix in `name`)
hf_token: str | Literal[False] | None
= False
optional Hugging Face token for loading private models or Spaces. By default, no token is sent to the server; set `hf_token=None` to use the locally saved token, if there is one. (Warning: when loading Spaces, only provide a token if you are loading a trusted private Space, as the token can be read by the Space you are loading.) Find your tokens here: https://huggingface.co/settings/tokens.
alias: str | None
= None
optional string used as the name of the loaded model instead of the default name (only applies if loading a Space running Gradio 2.x)