Puff is an open-source smoke-testing platform where students collaboratively write and run tests against their assignment or project code for quick and easy sanity checks.
Spring Boot microservice that manages user code uploads and runs test cases for the Puff platform. For a full overview of the Puff project see the docs repository.
Clone the project with git clone https://github.com/puffproject/test-runner.git
You'll need Java to run Puff's microservices, which are developed with Spring Boot.
- Download and install the Java JDK 8
- Set the JAVA_HOME environment variable
- Verify your installation by running java -version
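For example, on Linux or macOS the environment variable can be set as below. The JDK install path here is purely illustrative and will differ on your machine.

# Illustrative only -- point JAVA_HOME at your actual JDK 8 install location
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
java -version   # should report a 1.8.x version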
Puff uses Maven as its build tool for its backend.
Keycloak is an open source Identity and Access Management solution aimed at modern applications and services. It makes it easy to secure applications and services with little to no code.
Puff uses Keycloak as a user management and authentication solution. More information about Keycloak can be found on their official docs page.
Follow the instructions found at https://github.com/puffproject/docs#setup-keycloak to set up a local Keycloak server. This only needs to be configured once.
Once configured, generate an authentication token by making the following curl call, replacing TEST_USER_USERNAME, TEST_USER_PASSWORD and USER_AUTH_CLIENT_SECRET with the credentials for the test accounts you created and the client secret for user-auth.
curl -X POST 'http://localhost:8180/auth/realms/puff/protocol/openid-connect/token' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'grant_type=password' \
--data-urlencode 'client_id=user-auth' \
--data-urlencode 'client_secret=USER_AUTH_CLIENT_SECRET' \
--data-urlencode 'username=TEST_USER_USERNAME' \
--data-urlencode 'password=TEST_USER_PASSWORD'
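The token endpoint returns a standard OpenID Connect JSON response; the value you need is the access_token field. As a convenience, here is a sketch for capturing it straight into an environment variable (assumes Python 3 is available for JSON parsing):

TOKEN=$(curl -s -X POST 'http://localhost:8180/auth/realms/puff/protocol/openid-connect/token' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'grant_type=password' \
--data-urlencode 'client_id=user-auth' \
--data-urlencode 'client_secret=USER_AUTH_CLIENT_SECRET' \
--data-urlencode 'username=TEST_USER_USERNAME' \
--data-urlencode 'password=TEST_USER_PASSWORD' \
| python -c "import sys, json; print(json.load(sys.stdin)['access_token'])")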
Puff uses Docker containers to isolate and run test cases with user code in a secure environment before returning their results.
- Install Docker from their Getting started page.
- Build the following base images so Docker can cache some of the layers used by the processes and significantly speed up image build times.
Python3
cd src/main/resources/runner/docker/base && docker build -t pf_python3:base -f Dockerfile_py .
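You can confirm the base image was built by listing it:

docker images pf_python3   # should list pf_python3 with the base tag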
Once the images are built go back to the root of the repository for the rest of the steps.
- Change the runner.dir value in src/main/resources/application-local.yml to point to a directory where Puff can copy code files and run containers from.
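For example, you could create a dedicated scratch directory and point runner.dir at it; the path below is purely an illustration:

mkdir -p ~/puff-runner
# then set runner.dir in src/main/resources/application-local.yml to that directory's absolute path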
To run the microservice locally, run from the root directory:
mvn spring-boot:run -Dspring-boot.run.profiles=local
- If you need to build the .jar of the application, run mvn package.
- To run tests, run mvn test.
See the below commands for running tests:
- Run all tests: mvn test
- Run specific tests in a class: mvn test -Dtest=TestClassName
- Run a specific method in a class: mvn test -Dtest=TestClassName#TestMethodName
- Run tests and generate a coverage report: mvn test -P coverage
Puff's Spring Boot backend exposes a REST API. The project uses Swagger to document the API and keep the REST interface consistent.
Once you have the microservice running (see run the backend), visit http://localhost:8083/swagger-ui.html. A JSON version of the API, which can be consumed to generate client libraries, is available at http://localhost:8083/v2/api-docs.
Select Authorize and log in with a test user account to try out any of the endpoints.
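If you prefer the command line, the JSON API description can also be fetched directly. Whether the Authorization header is required depends on the security configuration, so treat this as a sketch:

curl --header "Authorization: Bearer ${TOKEN}" \
http://localhost:8083/v2/api-docs | python -m json.tool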
Puff's Spring Boot backend uses an in-memory H2 database to simulate a database connection for local development. Once the project is running it can be accessed at http://localhost:8083/h2.
The credentials for the database are as follows:
- Driver Class: org.h2.Driver
- JDBC URL: jdbc:h2:mem:testdb
- User Name: admin
- Password: (leave empty)
For more information about H2 databases see the H2 Database Engine.
The codebase is auto-formatted with the formatter-maven-plugin, which formats all source code files in the src/ and test/ directories according to the settings in the style.xml file, which are based on Eclipse profile settings.
Run the mvn formatter:format command to run the formatter. Formatting is also bound to the format goal, which runs as part of the compile phase.
You can also add the git pre-commit hook to your local .git/hooks folder to run the formatter on pre-commit.
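Installing the hook is a one-time copy; the source path below is only an illustration, use wherever the hook script lives in the repository:

cp path/to/pre-commit .git/hooks/pre-commit   # illustrative source path
chmod +x .git/hooks/pre-commit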
Puff runs each test case in isolation with a multi-layered, zero-trust approach to ensure security and speed when dealing with user-submitted code. It accomplishes this with the following steps:
- Building an image of the system on which to run the test case. The command is as follows:
docker build --build-arg DIR=/code -t image_name:tag .
The image is built on a base image for the required programming language. A new user appuser is provisioned and added to the proper groups. All required user code files are copied from the working directory into the image. An entrypoint that will run the shell script containing the test command is configured.
The following controls are applied:
- Docker image build timeout with Java thread timeout
- Running the constructed image as a container in which a shell script executing the test command is run. The command is as follows:
docker run --rm -m 450M --name my_container --env-file .env -e OTHER_ENV_VALUE=XX -v path/to/test/file:/code/file:ro image_name
The container is run as an executable with an environment supplied through an environment file. The container's entrypoint is the shell script containing the test command to run the test case. The test suite file containing the test cases to run is supplied through a read-only volume which is modified by Puff for each test case.
The following controls are applied:
- Docker memory limit with the -m flag
- Virtual memory limit with ulimit -v
- Stack memory limit with ulimit -s
- Max user processes with ulimit -u
- Max file size written with ulimit -f
- Max number of open file descriptors with ulimit -n
- Test command timeout with timeout
- Docker container timeout with Java thread timeout
Most of the above values are configurable from the environment file or the profile.
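As a rough sketch (the limit values and test command here are illustrative, not the exact script Puff generates), the entrypoint shell script combines these controls along these lines:

#!/bin/sh
# Illustrative only -- real limits come from the environment file / profile
ulimit -v 262144    # virtual memory limit (KB)
ulimit -s 8192      # stack size limit (KB)
ulimit -u 64        # max user processes
ulimit -f 1024      # max file size written (512-byte blocks)
ulimit -n 64        # max open file descriptors
# run the test command under a hard timeout
timeout 30s pytest /code/test_func.py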
Puff supports projects written in Python 3 and uses pytest to run test cases. After defining a test suite, set the suite file following the template below:
from pytest import *  # Python import REQUIRED
# ... import all source modules and functions
# ... import any other helper libraries

## Define a test class (or don't and use simple test functions)
## Name MUST start with Test* for it to be recognized
class TestFunc:
    def setup_method(self, method):
        # Set up any values test cases should have access to
        # e.g. self.x = 2
        pass

    def teardown_method(self, method):
        # Tear down any values set up
        pass

    ## There are other methods that pytest can take advantage of
    # For more information see https://docs.pytest.org/en/6.2.x/getting-started.html#create-your-first-test

    ## All other test cases defined will be appended here
    # ...
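Before uploading, you can sanity-check a suite file locally with pytest. This assumes pytest is installed on your machine, the file is named test_func.py as in the walkthrough further down, and any modules the suite imports are in the same directory:

pytest test_func.py -v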
Users will then upload test cases as part of the suite. Once users upload source files, they can run those test cases using your suite file against their uploaded source code.
For a full demo see the sample walkthrough below.
Haskell testing uses HUnit to run tests.
The same procedure followed for Python 3 applies here, with some notable differences:
- The uploaded suite file must import the main program file.
- A main function is automatically created, so the uploaded file must not contain one.
- The file automatically fully imports and exposes the HUnit and System.Exit packages, so avoid any conflicting symbols and names.
- The generated file should look something like this:
{-
  The contents for the uploaded suite file.
  This must include the module declaration.
  Must import the program file, the main submission.
-}

import Test.HUnit
import System.Exit

main :: IO ()
main = do
  executedTest <- runTestTT test_{func_name}
  print executedTest
  if errors executedTest > 0
    then do
      print $ "Test ran with " ++ show (errors executedTest) ++ " errors and " ++ show (failures executedTest) ++ " failures."
      exitWith $ ExitFailure 2
    else if failures executedTest > 0
      then do
        print $ "Test ran with " ++ show (failures executedTest) ++ " failures."
        exitWith $ ExitFailure 1
      else print "Test(s) ran with no problems!"

test_{func_name} :: Test
test_{func_name} = {func_body}
You can also check out src/main/resources/runner/docker/suite-template for the suite template.
Below is a sample test suite file called Test.hs which tests a submission file called Assignment1.hs.
Note: The submission file must have module Assignment1 where at the top, assuming the file is called Assignment1.hs.
module Test where
import Assignment1
- The test file name must match the module name. If you named the suite file TestSuite.hs, the module declaration must read module TestSuite where.
- The suite file name must be a valid module name.
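To catch import or type errors before uploading, you can type-check the suite file locally with GHC, assuming GHC and the HUnit package are installed and Assignment1.hs sits in the same directory; this is only a local convenience, not part of Puff's pipeline:

ghc -fno-code Test.hs   # type-checks the suite without producing a binary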
If you've followed the above setup steps, you should be able to follow the walkthrough below. This example uses test cases written in Python.
Start up the microservice (see run the backend). Authenticate as a user by making the curl call described in setup keycloak. Copy your token and set it as an environment variable:
TOKEN=YOUR_TOKEN_HERE
To pretty-format the curl responses I'm using Python's JSON tooling. If you don't have Python, remove the pipe at the end of each curl call.
You then need to create a test suite that will contain test cases. Make the following curl call:
curl --header "Content-Type: application/json" \
--header "Authorization: Bearer ${TOKEN}" \
--request POST \
--data '{"assignmentId": 1000, "language": "PYTHON3", "name": "Tests my function"}' \
http://localhost:8083/suite | python -m json.tool
Make a note of the id returned and set it as a variable: SUITE_ID=... See the list of test suites by running:
curl --header "Content-Type: application/json" \
--header "Authorization: Bearer ${TOKEN}" \
--request GET \
http://localhost:8083/suite | python -m json.tool
Next, set the test suite's base file. All test case code will be appended to this file, so it needs to be set up correctly. Create a file test_func.py with the following contents:
from pytest import *
from Func import *

## Test class for some function
class TestFunc:
    def setup_method(self, method):
        self.x = 2

    def teardown_method(self, method):
        pass
Run the following curl command to set the file for the test suite.
curl --header "Content-Type: multipart/form-data" \
--header "Authorization: Bearer ${TOKEN}" \
--request POST \
-F file=@test_func.py \
http://localhost:8083/suite/${SUITE_ID}/setFile | python -m json.tool
Next, create a Func.py file with a single function:
def getX():
    return 2
Upload your Func.py file as the "source code" for the project you want to test:
curl --header "Content-Type: multipart/form-data" \
--header "Authorization: Bearer ${TOKEN}" \
--request POST \
-F [email protected] \
-F assignmentId=1000 \
http://localhost:8083/upload | python -m json.tool
Next, upload the code for a test case you want to run. We'll check that the value returned from getX() in Func.py is equal to 2. Our test case will look like:
def test_isTwo(self):
    assert self.x == getX()
Create the test case in the test suite with the following call:
curl --header "Content-Type: application/json" \
--header "Authorization: Bearer ${TOKEN}" \
--request POST \
--data "{\"description\": \"Tests if value of x is 2\", \"body\": \"assert self.x == getX()\", \"suiteId\": ${SUITE_ID}, \"language\": \"PYTHON3\", \"functionName\": \"isTwo\"}" \
http://localhost:8083/case | python -m json.tool
Make a note of the id of the case and set it as a variable: CASE_ID=...
Lastly, run the test case, specifying the id of the test case to run:
curl --header "Content-Type: application/json" \
--header "Authorization: Bearer ${TOKEN}" \
--request POST \
--no-buffer \
http://localhost:8083/suite/${SUITE_ID}/run?ids=${CASE_ID} | python -m json.tool
You should receive a JSON response that looks something like the following:
{
  "caseId": {CASE_ID},
  "status": "PASS",
  "message": ...
}
The Puff project is looking for contributors to join the initiative! For information about progress, features under construction and opportunities to contribute see our project board.
If you're interested in helping, please read our CONTRIBUTING.md for details like our Code of Conduct, and contact Benjamin Kostiuk for more information.