Specify absolute path #352

Open · wants to merge 3 commits into main
2 changes: 1 addition & 1 deletion PrepareVicuna.md
Original file line number Diff line number Diff line change
Expand Up @@ -28,7 +28,7 @@ pip install git+https://github.com/lm-sys/[email protected]
Then, run the following command to create the final working weight

```
-python -m fastchat.model.apply_delta --base /path/to/llama-13bOR7b-hf/ --target /path/to/save/working/vicuna/weight/ --delta /path/to/vicuna-13bOR7b-delta-v0/
+python -m fastchat.model.apply_delta --base /absolute/path/to/llama-13bOR7b-hf/ --target /absolute/path/to/save/working/vicuna/weight/ --delta /absolute/path/to/vicuna-13bOR7b-delta-v0/
```

Now you are good to go!
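As a concrete illustration of the absolute-path form this PR asks for, with the weights stored under a hypothetical /home/user/models directory the command might look like this (all three directories are illustrative):

```bash
# All three arguments are absolute paths; the directory names are illustrative.
python -m fastchat.model.apply_delta \
    --base /home/user/models/llama-7b-hf/ \
    --target /home/user/models/vicuna-7b-working/ \
    --delta /home/user/models/vicuna-7b-delta-v0/
```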
README.md: 6 additions & 6 deletions

@@ -64,9 +64,9 @@ Download the corresponding LLM weights from the following huggingface space via
[Download](https://huggingface.co/Vision-CAIR/vicuna/tree/main) | [Download](https://huggingface.co/Vision-CAIR/vicuna-7b/tree/main) | [Download](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf/tree/main)


-Then, set the path to the vicuna weight in the model config file
+Then, set the absolute path to the vicuna weight in the model config file
[here](minigpt4/configs/models/minigpt4_vicuna0.yaml#L18) at Line 18
-and/or the path to the llama2 weight in the model config file
+and/or the absolute path to the llama2 weight in the model config file
[here](minigpt4/configs/models/minigpt4_llama2.yaml#L15) at Line 15.
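After this change, Line 18 of the Vicuna config would carry an absolute path. A minimal sketch, assuming the weights live under an illustrative /home/user/weights directory (the `llama_model` key is the one changed in the config diffs below):

```yaml
# minigpt4/configs/models/minigpt4_vicuna0.yaml, Line 18 (path illustrative)
llama_model: "/home/user/weights/vicuna-7b/"
```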

**3. Prepare the pretrained MiniGPT-4 checkpoint**
@@ -78,7 +78,7 @@ Download the pretrained checkpoints according to the Vicuna model you prepared.
[Download](https://drive.google.com/file/d/1a4zLvaiDBr-36pasffmgpvH5P7CKmpze/view?usp=share_link) | [Download](https://drive.google.com/file/d/1RY9jV0dyqLX-o38LrumkKRh6Jtaop58R/view?usp=sharing) | [Download](https://drive.google.com/file/d/11nAPjEok8eAGGEG1N2vXo3kBLCg0WgUk/view?usp=sharing)


-Then, set the path to the pretrained checkpoint in the evaluation config file
+Then, set the absolute path to the pretrained checkpoint in the evaluation config file
in [eval_configs/minigpt4_eval.yaml](eval_configs/minigpt4_eval.yaml#L10) at Line 8 for the Vicuna version or [eval_configs/minigpt4_llama2_eval.yaml](eval_configs/minigpt4_llama2_eval.yaml#L10) for the Llama-2 version.
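If the checkpoint was downloaded into the working directory, one way to obtain the absolute path the config now expects is to resolve it with realpath (a sketch; the filename is illustrative):

```bash
# Resolve a relative checkpoint location to the absolute path the config needs
realpath ./pretrained_minigpt4_7b.pth
```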


@@ -118,7 +118,7 @@ our [first stage dataset preparation instruction](dataset/README_1_STAGE.md).
After the first stage, the visual features are mapped and can be understood by the language
model.
To launch the first stage training, run the following command. In our experiments, we use 4 A100 GPUs.
-You can change the save path in the config file
+You can change the relative save path in the config file
[train_configs/minigpt4_stage1_pretrain.yaml](train_configs/minigpt4_stage1_pretrain.yaml)
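A sketch of what that save-path setting might look like in the stage-1 config; the `run`/`output_dir` key names are assumed here, and the value is a repository-relative directory:

```yaml
# train_configs/minigpt4_stage1_pretrain.yaml (key names assumed, value illustrative)
run:
  output_dir: "output/minigpt4_stage1_pretrain"
```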

```bash
# (training command collapsed in this diff view)
```

@@ -137,9 +137,9 @@ and convert it to a conversation format to further align MiniGPT-4.
To download and prepare our second stage dataset, please check our
[second stage dataset preparation instruction](dataset/README_2_STAGE.md).
To launch the second stage alignment,
-first specify the path to the checkpoint file trained in stage 1 in
+first specify the absolute path to the checkpoint file trained in stage 1 in
[train_configs/minigpt4_stage2_finetune.yaml](train_configs/minigpt4_stage2_finetune.yaml).
-You can also specify the output path there.
+You can also specify the relative output path there.
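A minimal sketch of those two settings in the stage-2 config; the `ckpt` key appears in the config diffs below, the `run`/`output_dir` key names are assumed, and both values are illustrative:

```yaml
# train_configs/minigpt4_stage2_finetune.yaml (values illustrative)
model:
  ckpt: '/absolute/path/to/stage1/checkpoint/'
run:
  output_dir: 'output/minigpt4_stage2_finetune'
```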
Then, run the following command. In our experiments, we use 1 A100.

```bash
# (training command collapsed in this diff view)
```
dataset/README_1_STAGE.md: 3 additions & 3 deletions

@@ -21,7 +21,7 @@ laion_synthetic_filtered_large.json

### Set up the dataset folder and move the annotation file to the data storage folder
```
-export MINIGPT4_DATASET=/YOUR/PATH/FOR/LARGE/DATASET/
+export MINIGPT4_DATASET=/YOUR/ABSOLUTE/PATH/FOR/LARGE/DATASET/
mkdir ${MINIGPT4_DATASET}/cc_sbu
mkdir ${MINIGPT4_DATASET}/laion
mv ccs_synthetic_filtered_large.json ${MINIGPT4_DATASET}/cc_sbu
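# (editor's sketch) MINIGPT4_DATASET should be an absolute path after this
# change; a relative location can be normalized first, e.g.:
#   export MINIGPT4_DATASET=$(realpath ./large_dataset)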
```

@@ -84,11 +84,11 @@ The final dataset structure

## Set up the dataset configuration files

-Then, set up the LAION dataset loading path in
+Then, set up the absolute LAION dataset loading path in
[here](../minigpt4/configs/datasets/laion/defaults.yaml#L5) at Line 5 as
${MINIGPT4_DATASET}/laion/laion_dataset/{00000..10488}.tar

-and the Conceptual Caption and SBU datasets loading path in
+and the absolute Conceptual Caption and SBU datasets loading path in
[here](../minigpt4/configs/datasets/cc_sbu/defaults.yaml#L5) at Line 5 as
${MINIGPT4_DATASET}/cc_sbu/cc_sbu_dataset/{00000..01255}.tar
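Since both storage values are absolute brace-expansion patterns, a quick sanity check is to count how many shards each pattern actually matches (a sketch; it assumes the tar files are already downloaded):

```bash
# Count the webdataset shards matched by the configured patterns
ls ${MINIGPT4_DATASET}/laion/laion_dataset/{00000..10488}.tar 2>/dev/null | wc -l
ls ${MINIGPT4_DATASET}/cc_sbu/cc_sbu_dataset/{00000..01255}.tar 2>/dev/null | wc -l
```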

2 changes: 1 addition & 1 deletion dataset/README_2_STAGE.md
Original file line number Diff line number Diff line change
Expand Up @@ -14,6 +14,6 @@ cc_sbu_align
```

Put the folder at any path you want.
-Then, set up the dataset path in the dataset config file
+Then, set up the absolute dataset path in the dataset config file
[here](../minigpt4/configs/datasets/cc_sbu/align.yaml#L5) at Line 5.

2 changes: 1 addition & 1 deletion eval_configs/minigpt4_llama2_eval.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -5,7 +5,7 @@ model:
end_sym: "</s>"
low_resource: True
prompt_template: '[INST] {} [/INST] '
-ckpt: '/path/to/checkpoint/'
+ckpt: '/absolute/path/to/checkpoint/'


datasets:
2 changes: 1 addition & 1 deletion minigpt4/configs/datasets/cc_sbu/align.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -2,4 +2,4 @@ datasets:
cc_sbu_align:
data_type: images
build_info:
-storage: /path/to/cc_sbu_align/
+storage: /absolute/path/to/cc_sbu_align/
2 changes: 1 addition & 1 deletion minigpt4/configs/datasets/cc_sbu/defaults.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -2,4 +2,4 @@ datasets:
cc_sbu:
data_type: images
build_info:
-storage: /path/to/cc_sbu_dataset/{00000..01255}.tar
+storage: /absolute/path/to/cc_sbu_dataset/{00000..01255}.tar
2 changes: 1 addition & 1 deletion minigpt4/configs/datasets/laion/defaults.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -2,4 +2,4 @@ datasets:
laion:
data_type: images
build_info:
-storage: /path/to/laion_dataset/{00000..10488}.tar
+storage: /absolute/path/to/laion_dataset/{00000..10488}.tar
2 changes: 1 addition & 1 deletion minigpt4/configs/models/minigpt4_llama2.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -12,7 +12,7 @@ model:
# generation configs
prompt: ""

llama_model: "/path/to/llama2/weight"
llama_model: "/absolute/path/to/llama2/weight"

preprocess:
vis_processor:
2 changes: 1 addition & 1 deletion minigpt4/configs/models/minigpt4_vicuna0.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -15,7 +15,7 @@ model:
# generation configs
prompt: ""

llama_model: "/path/to/vicuna/weight"
llama_model: "/absolute/path/to/vicuna/weight"

preprocess:
vis_processor:
2 changes: 1 addition & 1 deletion train_configs/minigpt4_llama2_stage2_finetune.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -6,7 +6,7 @@ model:
end_sym: "</s>"
prompt_path: "prompts/alignment.txt"
prompt_template: '[INST] {} [/INST] '
-ckpt: '/path/to/stage1/checkpoint/'
+ckpt: '/absolute/path/to/stage1/checkpoint/'


datasets:
2 changes: 1 addition & 1 deletion train_configs/minigpt4_stage2_finetune.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -6,7 +6,7 @@ model:
end_sym: "###"
prompt_path: "prompts/alignment.txt"
prompt_template: '###Human: {} ###Assistant: '
-ckpt: '/path/to/stage1/checkpoint/'
+ckpt: '/absolute/path/to/stage1/checkpoint/'


datasets: