Helper function to get a file, using either the Fetch API or FileSystem API.
Kind: static method of `utils/hub`
Returns: `Promise.<(FileResponse|Response)>` - A promise that resolves to a FileResponse object (if the file is retrieved using the FileSystem API), or a Response object (if the file is retrieved using the Fetch API).
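A minimal usage sketch, assuming the helper is exported as `getFile` from the `utils/hub` module (the export name and import path are assumptions based on the description above):

```js
// Sketch only: the export name and import path are assumptions.
import { getFile } from './utils/hub.js';

// A remote URL resolves to a Fetch API Response;
// a local path resolves to a FileResponse backed by the FileSystem API.
const response = await getFile('https://example.com/model/config.json');
const config = await response.json();
console.log(config);
```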
Retrieves a file from either a remote URL using the Fetch API or from the local file system using the FileSystem API. If the filesystem is available and env.useCache = true, the file will be downloaded and cached.
Kind: static method of `utils/hub`
Returns: `Promise` - A Promise that resolves with the file content as a buffer.
Throws:
Will throw an error if the file is not found and fatal is true.
| Param | Type | Default | Description |
| --- | --- | --- | --- |
| `path_or_repo_id` | `string` |  | This can be either: (1) a string, the model id of a model repo on boincai.com, or (2) a path to a directory potentially containing the file. |
| `filename` | `string` |  | The name of the file to locate in `path_or_repo`. |
| `[fatal]` | `boolean` | `true` | Whether to throw an error if the file is not found. |
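A minimal sketch of fetching a model file with the parameters above. The export name `getModelFile`, the import path, and the model id are assumptions; the non-throwing behaviour when `fatal` is `false` follows from the Throws note above.

```js
// Sketch only: export name, import path, and model id are assumptions.
import { getModelFile } from './utils/hub.js';

// Look for `config.json` in a model repo on boincai.com (or in a local directory).
// With fatal = false, a missing file does not throw (see the Throws note above).
const buffer = await getModelFile('some-org/some-model', 'config.json', false);
if (buffer) {
    console.log(`Fetched ${buffer.byteLength} bytes`);
}
```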
Updates the 'content-type' header property of the response based on the extension of the file specified by the filePath property of the current object.
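The method itself operates on the FileResponse's own headers; the sketch below only illustrates the idea of deriving a content-type from a file extension (the mapping shown is an assumption, not the library's actual table):

```js
// Illustration only: an assumed extension-to-content-type mapping.
function contentTypeFromPath(filePath) {
    const extension = filePath.split('.').pop().toLowerCase();
    switch (extension) {
        case 'json': return 'application/json';
        case 'txt':  return 'text/plain';
        default:     return 'application/octet-stream';
    }
}

console.log(contentTypeFromPath('./models/config.json')); // 'application/json'
```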
Reads the contents of the file specified by the filePath property and returns a Promise that resolves with an ArrayBuffer containing the file's contents.
Kind: instance method of `FileResponse`
Returns: `Promise.<ArrayBuffer>` - A Promise that resolves with an ArrayBuffer containing the file's contents.
Throws:
Reads the contents of the file specified by the filePath property and returns a Promise that resolves with a parsed JavaScript object containing the file's contents.
Kind: instance method of `FileResponse`
Returns: `Promise.<Object>` - A Promise that resolves with a parsed JavaScript object containing the file's contents.
Throws:
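A minimal sketch of reading a local file through these methods, assuming the FileResponse is obtained from the hub helper for a local path (the export name and import path are assumptions):

```js
// Sketch only: import path is an assumption; a local path yields a FileResponse.
import { getFile } from './utils/hub.js';

// Raw bytes of the file.
const binary = await getFile('./models/my-model/model.onnx');
const bytes = await binary.arrayBuffer();
console.log(`Read ${bytes.byteLength} bytes`);

// Parse a JSON file into a JavaScript object.
const config = await (await getFile('./models/my-model/config.json')).json();
console.log(config);
```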
| Param | Type | Default | Description |
| --- | --- | --- | --- |
| `[quantized]` | `boolean` | `true` | Whether to load the 8-bit quantized version of the model (only applicable when loading model files). |
| `[progress_callback]` | `function` |  | If specified, this function will be called during model construction, to provide the user with progress updates. |
| `[config]` | `Object` |  | Configuration for the model to use instead of an automatically loaded configuration. Configuration can be automatically loaded when: (1) the model is a model provided by the library (loaded with the model id string of a pretrained model), or (2) the model is loaded by supplying a local directory as `pretrained_model_name_or_path` and a configuration JSON file named `config.json` is found in the directory. |
| `[cache_dir]` | `string` | `null` | Path to a directory in which a downloaded pretrained model configuration should be cached if the standard cache should not be used. |
| `[local_files_only]` | `boolean` | `false` | Whether or not to only look at local files (e.g., not try downloading the model). |
| `[revision]` | `string` | `'main'` | The specific model version to use. It can be a branch name, a tag name, or a commit id. Since we use a git-based system for storing models and other artifacts on boincai.com, `revision` can be any identifier allowed by git. NOTE: This setting is ignored for local requests. |
| `[model_file_name]` | `string` | `null` | If specified, load the model with this name (excluding the `.onnx` suffix). Currently only valid for encoder- or decoder-only models. |
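A minimal sketch of an options object matching the table above. The loader that consumes it (for example, a from_pretrained-style call) is outside this section, so only the option names, types, and documented defaults are taken from the table:

```js
// Sketch only: the values mirror the documented defaults; adjust as needed.
const options = {
    quantized: true,                                 // load the 8-bit quantized weights
    progress_callback: (data) => console.log(data),  // progress updates during construction
    config: null,                                    // let the configuration be auto-loaded
    cache_dir: null,                                 // use the standard cache location
    local_files_only: false,                         // allow downloads from boincai.com
    revision: 'main',                                // branch name, tag name, or commit id
    model_file_name: null,                           // use the default model file name
};
```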