Commit ba8ad7d

update some important documentationy bits
1 parent 5ba984e commit ba8ad7d

3 files changed: 52 additions & 47 deletions

.gitignore

Lines changed: 2 additions & 0 deletions
@@ -23,6 +23,8 @@ var/
 .installed.cfg
 *.egg

+.env
+
 # PyInstaller
 # Usually these files are written by a python script from a template
 # before PyInstaller builds the exe, so as to inject date/other infos into it.

README.md

Lines changed: 48 additions & 47 deletions
@@ -8,45 +8,23 @@ A comprehensive membership evaluations solution for Computer Science House.
 Development
 -----------

-### Config
+## Running (containerized)

-You must create `config.py` in the top-level directory with the appropriate credentials for the application to run. See `config.env.py` for an example.
-
-#### Add OIDC Config
-Reach out to an RTP to get OIDC credentials that will allow you to develop locally behind OIDC auth
-```py
-# OIDC Config
-OIDC_ISSUER = "https://sso.csh.rit.edu/auth/realms/csh"
-OIDC_CLIENT_CONFIG = {
-    'client_id': '',
-    'client_secret': '',
-    'post_logout_redirect_uris': ['http://0.0.0.0:6969/logout']
-}
-```
-
-#### Add S3 Config
-An S3 bucket is used to store files that users upload (currently just for major project submissions). In order to have this work properly, you need to provide some credentials to the app.
+It is likely easier to use containers like `podman` or `docker` or the corresponding compose file

-There are 2 ways that you can get the needed credentials.
-1. Reach out to an RTP for creds to the dev bucket
-2. Create your own bucket using [DEaDASS](https://deadass.csh.rit.edu/), and the site will give you the credentials you need.
+With podman, I have been using

-```py
-S3_URI = env.get("S3_URI", "https://s3.csh.rit.edu")
-S3_BUCKET_ID = env.get("S3_BUCKET_ID", "major-project-media")
-AWS_ACCESS_KEY_ID = env.get("AWS_ACCESS_KEY_ID", "")
-AWS_SECRET_ACCESS_KEY = env.get("AWS_SECRET_ACCESS_KEY", "")
+```sh
+podman compose up --watch
 ```

-#### Database
-You can either develop using the dev database, or use the local database provided in the docker compose file
-
-Using the local database is detailed below, but both options will require the dev database password, so you will have to ask an RTP for this too
-
-#### Forcing evals/rtp or anything else
-All of the role checking is done in `conditional/utils/user_dict.py`, and you can change the various functions to `return True` for debugging
+If you want, you can run without auto rebuild using
+```sh
+podman compose up --force-recreate --build
+```
+Which can be restarted every time changes are made.

-### Run (Without Docker)
+## Run (Without Docker)

 To run the application without using containers, you must have the latest version of [Python 3](https://www.python.org/downloads/) and [virtualenv](https://virtualenv.pypa.io/en/stable/installation/) installed. Once you have those installed, create a new virtualenv and install the Python dependencies:

@@ -90,30 +68,53 @@ or
 python -m gunicorn
 ```

-### Run (containerized)
+## Config

-It is likely easier to use containers like `podman` or `docker` or the corresponding compose file
-
-With podman, I have been using
+You must create `config.py` in the top-level directory with the appropriate credentials for the application to run. See `config.env.py` for an example.

-```sh
-podman compose up --watch
+### Add OIDC Config
+Reach out to an RTP to get OIDC credentials that will allow you to develop locally behind OIDC auth
+```py
+# OIDC Config
+OIDC_ISSUER = "https://sso.csh.rit.edu/auth/realms/csh"
+OIDC_CLIENT_CONFIG = {
+    'client_id': '',
+    'client_secret': '',
+    'post_logout_redirect_uris': ['http://0.0.0.0:6969/logout']
+}
 ```

-If you want, you can run without compose support using
-```sh
-podman compose up --force-recreate --build
+### Add S3 Config
+An S3 bucket is used to store files that users upload (currently just for major project submissions). In order to have this work properly, you need to provide some credentials to the app.
+
+There are 2 ways that you can get the needed credentials.
+1. Reach out to an RTP for creds to the dev bucket
+2. Create your own bucket using [DEaDASS](https://deadass.csh.rit.edu/), and the site will give you the credentials you need.
+
+```py
+S3_URI = env.get("S3_URI", "https://s3.csh.rit.edu")
+S3_BUCKET_ID = env.get("S3_BUCKET_ID", "major-project-media")
+AWS_ACCESS_KEY_ID = env.get("AWS_ACCESS_KEY_ID", "")
+AWS_SECRET_ACCESS_KEY = env.get("AWS_SECRET_ACCESS_KEY", "")
 ```

-Which can be restarted every time changes are made
+### Database
+You can either develop using the dev database, or use the local database provided in the docker compose file
+
+Using the local database is detailed below, but both options will require the dev database password, so you will have to ask an RTP for this too
+
+### Forcing evals/rtp or anything else
+All of the role checking is done in `conditional/utils/user_dict.py`, and you can change the various functions to `return True` for debugging
+
+

-### Dependencies
+## Dependencies

 To add new dependencies, add them to `requirements.in` and then run `pip-compile requirements.in` to produce a new locked `requirements.txt`. Do not edit `requirements.txt` directly as it will be overwritten by future PRs.

-### Database Stuff
+## Database Stuff

-#### Local database
+### Local database

 You can run the database locally using the docker compose

@@ -130,7 +131,7 @@ To run migration commands in the local database, you can run the commands inside
 podman exec conditional flask db upgrade
 ```

-#### Database Migrations
+### Database Migrations

 If the database schema is changed after initializing the database, you must migrate it to the new schema by running:

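The S3 snippet in the README calls `env.get(...)` without showing where `env` comes from. A minimal sketch, assuming `env` is `os.environ` (that import is not part of this diff, so the alias is an assumption):

```python
# Sketch of how config.py presumably resolves its S3 settings:
# `env` is assumed to be os.environ, so each setting falls back to
# its default when the corresponding environment variable is unset.
from os import environ as env

S3_URI = env.get("S3_URI", "https://s3.csh.rit.edu")
S3_BUCKET_ID = env.get("S3_BUCKET_ID", "major-project-media")
AWS_ACCESS_KEY_ID = env.get("AWS_ACCESS_KEY_ID", "")
AWS_SECRET_ACCESS_KEY = env.get("AWS_SECRET_ACCESS_KEY", "")
```

Exporting one of these variables before starting the app (or putting it in `.env`) overrides the default without touching the code.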

docker-compose.yaml

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@ services:
       - conditional-postgres
     ports:
       - "127.0.0.1:8080:8080"
+    env_file:
+      - .env
     volumes:
       - ./migrations:/opt/conditional/migrations
     develop:
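The commit gitignores `.env` and loads it into the container via `env_file`, but never shows the file itself. A hypothetical sketch of what it might contain, assuming it carries the S3/AWS variables the README's config reads (names from the README, values placeholders):

```sh
# .env — kept out of version control by the .gitignore change above,
# loaded into the container by the compose env_file entry.
# Variable names follow the README's S3 config; values are placeholders.
S3_URI=https://s3.csh.rit.edu
S3_BUCKET_ID=major-project-media
AWS_ACCESS_KEY_ID=changeme
AWS_SECRET_ACCESS_KEY=changeme
```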
