This repository has been archived by the owner on Feb 6, 2021. It is now read-only.

Code Compilation Verification #23

Open
janaSunrise opened this issue Jun 11, 2020 · 5 comments
Labels
help wanted Extra attention is needed priority: 1 - high High priority status: planning Discussing details type: feature New feature or request

Comments

@janaSunrise
Contributor

We need to restrict the eval command in the bot to verified users, but to implement that we must first add a verification system. We could take something like AltDentifier's verification flow as an example. Currently, eval is restricted to owners only.

@janaSunrise janaSunrise added help wanted Extra attention is needed priority: 1 - high High priority type: feature New feature or request status: planning Discussing details labels Jun 11, 2020
@janaSunrise janaSunrise self-assigned this Jun 11, 2020
@ItsDrike
Contributor

It is still pretty dangerous to grant access to eval to anyone outside trusted users (staff members, etc.). We need to make sure that eval runs in a safe environment to prevent things like:

Infinite loops

while True:
    print("hi")

These could run forever and consume computing power.
This could probably be solved with a time limit on code execution.
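The time-limit idea could be sketched by running the snippet in a subprocess and killing it when the limit is hit. This is only a sketch; `run_untrusted` is a hypothetical helper, not an existing bot command:

```python
import subprocess

def run_untrusted(code: str, timeout: float = 5.0) -> str:
    """Run a Python snippet in a subprocess, killing it after `timeout` seconds."""
    try:
        result = subprocess.run(
            ["python3", "-c", code],
            capture_output=True,
            text=True,
            timeout=timeout,
        )
        return result.stdout
    except subprocess.TimeoutExpired:
        # The infinite loop case: the child is killed instead of running forever.
        return "Error: execution exceeded the time limit"

# An infinite loop gets cut off after ~1 second:
print(run_untrusted("while True: pass", timeout=1.0))
```

Note that a timeout alone does not address the file-manipulation problems below; it only bounds runtime.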

Unsafe file manipulation

with open("main.py", "w") as f:
    f.write("")

This would effectively overwrite our main.py file with an empty string.

import os
os.remove("main.py")

This would delete the main.py file completely.

@ghost

ghost commented Jun 23, 2020

Hello, I have a question about the actual implementation of the verification system. Have you made any updates on it, or is it still in the backlog?

I don't mind writing it and implementing it.

About the eval command, the first thing that comes to mind is running the eval in a Docker container that you start/stop on demand. That shouldn't be too demanding performance-wise (Alpine + Python, for example).
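The on-demand container idea could look roughly like this. This is a hypothetical sketch assuming Docker is installed and the `python:3-alpine` image is pulled; `build_sandbox_command` and `eval_in_container` are made-up names:

```python
import subprocess

def build_sandbox_command(code: str) -> list[str]:
    """Build a `docker run` command for a throwaway, locked-down container."""
    return [
        "docker", "run", "--rm",   # discard the container when it exits
        "--network", "none",       # no network access from inside
        "--memory", "128m",        # cap memory usage
        "--cpus", "0.5",           # cap CPU usage
        "python:3-alpine",
        "python", "-c", code,
    ]

def eval_in_container(code: str, timeout: float = 10.0) -> str:
    try:
        result = subprocess.run(
            build_sandbox_command(code),
            capture_output=True,
            text=True,
            timeout=timeout,
        )
        return result.stdout or result.stderr
    except subprocess.TimeoutExpired:
        return "Error: container execution timed out"

# Example usage (requires a running Docker daemon):
# print(eval_in_container("print(1 + 1)"))
```

Starting a fresh container per eval keeps the host filesystem out of reach, at the cost of container startup latency on each call.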

@ItsDrike
Contributor

@sonipn the verification hasn't been implemented yet, and it isn't currently being worked on; you can take it up if you feel like doing it.

About the eval: even though it would only run in Docker, I'm still not a huge fan of eval, as it can still be quite dangerous. There might be an exploit in Docker itself, which with this kind of execution would be a huge problem. I don't think it's worth the risk until we know for sure that the input to eval is fully sanitized. What we could do instead is use an external API to run the code in eval for us.
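One concrete external-API option would be a public code-execution service such as Piston. The sketch below assumes the shape of Piston's v2 `execute` endpoint; the payload fields and pinned version should be checked against the current Piston documentation before relying on this:

```python
import json
import urllib.request

# Public Piston instance (assumption: endpoint and payload shape per Piston v2).
PISTON_URL = "https://emkc.org/api/v2/piston/execute"

def build_payload(code: str) -> dict:
    """Build the execute-request payload for a Python snippet."""
    return {
        "language": "python",
        "version": "3.10.0",  # assumed available version; query /runtimes to confirm
        "files": [{"name": "main.py", "content": code}],
    }

def run_remote(code: str) -> str:
    """POST the snippet to the execution service and return its combined output."""
    req = urllib.request.Request(
        PISTON_URL,
        data=json.dumps(build_payload(code)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)["run"]["output"]
```

Offloading execution this way moves the sandboxing problem to the service, so a Docker or runtime exploit would land on their infrastructure rather than the bot's host.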

@ActuallyDaneel
Contributor

ActuallyDaneel commented Jul 19, 2020

Not sure whether this is applicable here, but the Python Discord bot uses a custom sandbox (here) to run their eval command. It will probably be difficult to implement, but perhaps there are some pointers you could pick up from it.

@ItsDrike
Contributor

@therealdaneel yes, something like python-discord's snekbox would be an option, but it would take a while to develop something like that.
