
Commit 78dd2e0

fix evals stuff (#552)
* fix evals stuff: Spring evals errors, Housing point stuff
* update some important documentationy bits
* fixed migration

Co-authored-by: Tyler Allen <tyler@tallen.me>
1 parent c1e6ae5 commit 78dd2e0

File tree

7 files changed: +58 −63 lines changed

.gitignore (2 additions & 0 deletions)

```diff
@@ -23,6 +23,8 @@ var/
 .installed.cfg
 *.egg
 
+.env
+
 # PyInstaller
 # Usually these files are written by a python script from a template
 # before PyInstaller builds the exe, so as to inject date/other infos into it.
```

README.md (48 additions & 47 deletions)

````diff
@@ -8,45 +8,23 @@ A comprehensive membership evaluations solution for Computer Science House.
 Development
 -----------
 
-### Config
+## Running (containerized)
 
-You must create `config.py` in the top-level directory with the appropriate credentials for the application to run. See `config.env.py` for an example.
-
-#### Add OIDC Config
-Reach out to an RTP to get OIDC credentials that will allow you to develop locally behind OIDC auth
-```py
-# OIDC Config
-OIDC_ISSUER = "https://sso.csh.rit.edu/auth/realms/csh"
-OIDC_CLIENT_CONFIG = {
-    'client_id': '',
-    'client_secret': '',
-    'post_logout_redirect_uris': ['http://0.0.0.0:6969/logout']
-}
-```
-
-#### Add S3 Config
-An S3 bucket is used to store files that users upload (currently just for major project submissions). In order to have this work properly, you need to provide some credentials to the app.
+It is likely easier to use containers like `podman` or `docker` or the corresponding compose file
 
-There are 2 ways that you can get the needed credentials.
-1. Reach out to an RTP for creds to the dev bucket
-2. Create your own bucket using [DEaDASS](https://deadass.csh.rit.edu/), and the site will give you the credentials you need.
+With podman, I have been using
 
-```py
-S3_URI = env.get("S3_URI", "https://s3.csh.rit.edu")
-S3_BUCKET_ID = env.get("S3_BUCKET_ID", "major-project-media")
-AWS_ACCESS_KEY_ID = env.get("AWS_ACCESS_KEY_ID", "")
-AWS_SECRET_ACCESS_KEY = env.get("AWS_SECRET_ACCESS_KEY", "")
+```sh
+podman compose up --watch
 ```
 
-#### Database
-You can either develop using the dev database, or use the local database provided in the docker compose file
-
-Using the local database is detailed below, but both options will require the dev database password, so you will have to ask an RTP for this too
-
-#### Forcing evals/rtp or anything else
-All of the role checking is done in `conditional/utils/user_dict.py`, and you can change the various functions to `return True` for debugging
+If you want, you can run without auto rebuild using
+```sh
+podman compose up --force-recreate --build
+```
+Which can be restarted every time changes are made.
 
-### Run (Without Docker)
+## Run (Without Docker)
 
 To run the application without using containers, you must have the latest version of [Python 3](https://www.python.org/downloads/) and [virtualenv](https://virtualenv.pypa.io/en/stable/installation/) installed. Once you have those installed, create a new virtualenv and install the Python dependencies:
 
@@ -90,30 +68,53 @@ or
 python -m gunicorn
 ```
 
-### Run (containerized)
+## Config
 
-It is likely easier to use containers like `podman` or `docker` or the corresponding compose file
-
-With podman, I have been using
+You must create `config.py` in the top-level directory with the appropriate credentials for the application to run. See `config.env.py` for an example.
 
-```sh
-podman compose up --watch
+### Add OIDC Config
+Reach out to an RTP to get OIDC credentials that will allow you to develop locally behind OIDC auth
+```py
+# OIDC Config
+OIDC_ISSUER = "https://sso.csh.rit.edu/auth/realms/csh"
+OIDC_CLIENT_CONFIG = {
+    'client_id': '',
+    'client_secret': '',
+    'post_logout_redirect_uris': ['http://0.0.0.0:6969/logout']
+}
 ```
 
-If you want, you can run without compose support using
-```sh
-podman compose up --force-recreate --build
+### Add S3 Config
+An S3 bucket is used to store files that users upload (currently just for major project submissions). In order to have this work properly, you need to provide some credentials to the app.
+
+There are 2 ways that you can get the needed credentials.
+1. Reach out to an RTP for creds to the dev bucket
+2. Create your own bucket using [DEaDASS](https://deadass.csh.rit.edu/), and the site will give you the credentials you need.
+
+```py
+S3_URI = env.get("S3_URI", "https://s3.csh.rit.edu")
+S3_BUCKET_ID = env.get("S3_BUCKET_ID", "major-project-media")
+AWS_ACCESS_KEY_ID = env.get("AWS_ACCESS_KEY_ID", "")
+AWS_SECRET_ACCESS_KEY = env.get("AWS_SECRET_ACCESS_KEY", "")
 ```
 
-Which can be restarted every time changes are made
+### Database
+You can either develop using the dev database, or use the local database provided in the docker compose file
+
+Using the local database is detailed below, but both options will require the dev database password, so you will have to ask an RTP for this too
+
+### Forcing evals/rtp or anything else
+All of the role checking is done in `conditional/utils/user_dict.py`, and you can change the various functions to `return True` for debugging
+
+
 
-### Dependencies
+## Dependencies
 
 To add new dependencies, add them to `requirements.in` and then run `pip-compile requirements.in` to produce a new locked `requirements.txt`. Do not edit `requirements.txt` directly as it will be overwritten by future PRs.
 
-### Database Stuff
+## Database Stuff
 
-#### Local database
+### Local database
 
 You can run the database locally using the docker compose
 
@@ -130,7 +131,7 @@ To run migration commands in the local database, you can run the commands inside
 podman exec conditional flask db upgrade
 ```
 
-#### Database Migrations
+### Database Migrations
 
 If the database schema is changed after initializing the database, you must migrate it to the new schema by running:
````
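One note on the README's S3 snippet above: it calls `env.get(...)` without showing where `env` comes from; it presumably refers to `os.environ`, as `config.env.py` suggests. A minimal self-contained sketch of that pattern (the default values are the ones from the README; everything else here is illustrative):

```python
# Assumes `env` is os.environ, which supports dict-style .get() with a
# fallback default when the environment variable is unset.
from os import environ as env

S3_URI = env.get("S3_URI", "https://s3.csh.rit.edu")
S3_BUCKET_ID = env.get("S3_BUCKET_ID", "major-project-media")
AWS_ACCESS_KEY_ID = env.get("AWS_ACCESS_KEY_ID", "")
AWS_SECRET_ACCESS_KEY = env.get("AWS_SECRET_ACCESS_KEY", "")
```

With the `env_file: .env` entry added to the compose file in this commit, values placed in `.env` would land in the container's environment and override these defaults.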

conditional/blueprints/conditional.py (2 additions & 2 deletions)

```diff
@@ -106,8 +106,8 @@ def conditional_review(user_dict=None):
 
     if status == 'Passed':
         account = ldap_get_member(uid)
-        hp = account.housingPoints
-        ldap_set_housingpoints(account, hp + 2)
+        hp = int(account.housingPoints)
+        ldap_set_housingpoints(account, str(hp + 2))
 
     elif cond_obj.i_evaluation:
         FreshmanEvalData.query.filter(FreshmanEvalData.id == cond_obj.i_evaluation).update(
```
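The housing-point change above (made identically in slideshow.py) appears to address a type mismatch: LDAP attribute values typically come back as strings, so adding an integer to them fails. A small sketch of the bug and the fix; `FakeAccount` is a hypothetical stand-in for the object `ldap_get_member` returns:

```python
class FakeAccount:
    # LDAP hands attribute values back as strings, not ints.
    housingPoints = "4"

account = FakeAccount()

# Pre-fix shape: str + int raises TypeError.
try:
    bumped = account.housingPoints + 2
except TypeError:
    bumped = None  # this is the failure the commit fixes

# Post-fix shape: convert before the arithmetic, write back as a string.
hp = int(account.housingPoints)
fixed = str(hp + 2)
```

Converting back to `str` before `ldap_set_housingpoints` matches the write path, since the directory stores the attribute as a string as well.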

conditional/blueprints/slideshow.py (2 additions & 2 deletions)

```diff
@@ -138,8 +138,8 @@ def slideshow_spring_review(user_dict=None):
             if ldap_is_intromember(account):
                 ldap_set_not_intro_member(account)
 
-            hp = account.housingPoints
-            ldap_set_housingpoints(account, hp + 2)
+            hp = int(account.housingPoints)
+            ldap_set_housingpoints(account, str(hp + 2))
         elif status == "Failed":
             if ldap_is_intromember(account):
                 ldap_set_failed(account)
```

conditional/util/ldap.py (2 additions & 2 deletions)

```diff
@@ -136,13 +136,13 @@ def ldap_set_inactive(account):
 
 def ldap_set_intro_member(account):
     _ldap_add_member_to_group(account, 'intromembers')
-    ldap_get_intro_members().cache_clear()
+    ldap_get_intro_members.cache_clear()
     ldap_get_member.cache_clear()
 
 
 def ldap_set_not_intro_member(account):
     _ldap_remove_member_from_group(account, 'intromembers')
-    ldap_get_intro_members().cache_clear()
+    ldap_get_intro_members.cache_clear()
     ldap_get_member.cache_clear()
 
 
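The fix above drops a stray call: `cache_clear` is an attribute of the `lru_cache`-decorated function object itself, not of its return value, so `ldap_get_intro_members().cache_clear()` performed an unwanted LDAP lookup and then tried to clear a cache on the result. A minimal sketch with the standard library's `functools.lru_cache` (the function name here is an illustrative stand-in, not the real implementation):

```python
from functools import lru_cache

calls = []

@lru_cache(maxsize=None)
def get_intro_members():  # stand-in for ldap_get_intro_members
    calls.append(1)       # record each real "lookup"
    return ["member_a", "member_b"]

get_intro_members()
get_intro_members()       # second call served from cache; no new lookup
assert len(calls) == 1

get_intro_members.cache_clear()  # correct: clear on the function object
get_intro_members()              # cache is cold again, so a real lookup
assert len(calls) == 2
```

With the old form, the stale membership list could survive a group change, which is consistent with the "Spring evals errors" this commit targets.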
docker-compose.yaml (2 additions & 0 deletions)

```diff
@@ -7,6 +7,8 @@ services:
       - conditional-postgres
     ports:
       - "127.0.0.1:8080:8080"
+    env_file:
+      - .env
     volumes:
      - ./migrations:/opt/conditional/migrations
    develop:
```

migrations/versions/e38beaf3e875_update_db.py (0 additions & 10 deletions)

```diff
@@ -16,15 +16,6 @@
 
 def upgrade():
     # ### commands auto generated by Alembic - please adjust! ###
-    op.drop_index('member_batch_users_id_idx', table_name='member_batch_users')
-    op.drop_table('member_batch_users')
-    op.drop_index('freshman_batch_pulls_id_idx', table_name='freshman_batch_pulls')
-    op.drop_table('freshman_batch_pulls')
-    op.drop_index('member_batch_pulls_id_idx', table_name='member_batch_pulls')
-    op.drop_table('member_batch_pulls')
-    op.drop_index('freshman_batch_users_id_pkey', table_name='freshman_batch_users')
-    op.drop_table('freshman_batch_users')
-    op.drop_table('batch_conditions')
     op.alter_column('freshman_accounts', 'onfloor_status',
                     existing_type=sa.BOOLEAN(),
                     nullable=True)
@@ -37,7 +28,6 @@ def upgrade():
     op.alter_column('member_hm_attendance', 'attendance_status',
                     existing_type=postgresql.ENUM('Attended', 'Excused', 'Absent', name='attendance_enum'),
                     nullable=True)
-    op.drop_table('batch')
     # ### end Alembic commands ###
 
 
```
