PersoData Challenge with raw image

Organized by persodata

Previous: Development Phase (Nov. 15, 2018, midnight UTC)
Current: Final Phase (April 30, 2019, midnight UTC)
End: Competition Ends (Never)

Detect Fake Paintings

This challenge addresses the problem of detecting fake paintings. Recently, artificially created paintings have raised many questions, as people wonder whether they can be considered "art" and how harmful they might be to authentic art. While artificial images open the way to stunning and ever more precise results, it is hard to credit them with any creative intent, since they all follow a certain algorithmic pattern (even though some artificial creations have been valued at about $10,000). Recent work shows that it is becoming increasingly difficult for the naked eye to tell artificial paintings apart from real ones. Moreover, since there are often more fake artistic objects in circulation than authentic ones, this poses a threat of art forgery and can have a negative impact on some traditional arts (e.g. Aboriginal art, where 80% of the products are inauthentic).

Nowadays, as computer graphics techniques for image generation reach stunning levels of quality, it becomes more and more challenging to tell fake images from true, authentic ones. This challenge uses paintings from WikiArt (the Visual Art Encyclopedia). Half of the images in this data set are fake paintings generated with a Generative Adversarial Network (GAN). The task is binary classification: the goal is to detect the fake paintings.

In this challenge, we give participants the raw images as the challenge data, which is why we call it the PersoData challenge with raw image.

If you choose this challenge, you should try methods for processing images; a minimal loading sketch follows.
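As a rough starting point, here is a minimal sketch of loading one painting and turning it into a normalized numeric array with Pillow and NumPy. The file name painting.png is only a hypothetical placeholder for one of the challenge images.

    # Minimal image-loading sketch; "painting.png" is a hypothetical placeholder.
    import numpy as np
    from PIL import Image

    img = Image.open("painting.png").convert("RGB")  # load and force 3 channels
    img = img.resize((64, 64))                       # challenge images are 64x64x3
    x = np.asarray(img, dtype=np.float32) / 255.0    # scale pixel values to [0, 1]
    print(x.shape)                                   # (64, 64, 3)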

[Figure: examples of some fake paintings (top) and real paintings (bottom)]

Using Docker

If you don't have a suitable machine of your own, you can use our Docker image; if you do, you can simply work on your own computer.

Retrieve the Docker image:

docker pull mok0na/l2rpn:2.0

Retrieve the notebook README.ipynb to help you start this competition. It is available in the starting kit, which you can download from the Files tab under the Participate tab.

Download the starting kit and the public data:

unzip starting_kit -d starting-kit    # extract the starting kit archive
cp -r starting-kit ~/aux              # ~/aux will be mounted into the Docker container

Run the Jupyter notebook:

docker run --name persodata -it -p 5000:8888 -v ~/aux:/home/aux mok0na/l2rpn:2.0 jupyter notebook --ip 0.0.0.0 --notebook-dir=/home/aux --allow-root

This command will print a link to the notebook. Open the link, replacing port 8888 with 5000, e.g.: http://127.0.0.1:5000/?token=2b4e492be2f542b1ed5a645fa2cfbedcff9e67d50bb35380

If you need libraries that are not installed in the Docker image, run the command docker exec -it persodata bash to open a shell inside the container.

For example, to install the tqdm library, run the command pip install tqdm inside the container and the library will be installed.

To reuse the container later, restart it and open the link http://127.0.0.1:5000:

docker start persodata


Steps to do the challenge

  1. Download the starting kit and the public data (you will get a folder named inputdata_test; this is the data you will use for this challenge).
  2. In the starting kit folder there is a subfolder called sample_data, in which we have placed about 100 images so that you can try README.ipynb on a small dataset first. This small dataset will make it easier to understand the challenge.
  3. Delete the folder sample_data from the starting kit folder.
  4. Rename the folder inputdata_test to sample_data and place it in the starting kit folder.
  5. Change the code in model.py to try to solve this challenge, using our README.ipynb (a rough sketch of such a model follows this list).
  6. Make some submissions on Codalab.
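The actual interface of model.py is defined by the starting kit and README.ipynb; as orientation only, here is a hedged sketch that assumes the common fit/predict pattern of Codalab starting kits (the class and method names may differ in the real skeleton), using a simple scikit-learn baseline on the flattened pixel vectors.

    # Hedged sketch of a model.py-style classifier; the fit/predict interface
    # is an assumption -- follow the actual skeleton shipped in the starting kit.
    from sklearn.linear_model import LogisticRegression

    class Model:
        def __init__(self):
            # Simple linear baseline on flattened 64*64*3 pixel vectors.
            self.clf = LogisticRegression(max_iter=1000)

        def fit(self, X, y):
            self.clf.fit(X, y)
            return self

        def predict(self, X):
            # Return the probability of the "real" class (label 1),
            # which is what the AUC metric rewards.
            return self.clf.predict_proba(X)[:, 1]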

References and credits:
https://www.wikiart.org/
Members of PersoData: Jiaxin Gao, Issa Hammoud, Valentin Carpentier, Hugues Ali Mehenni, Hugo Boulanger and Min Li
If you have any questions about this challenge, you can contact us at persodata@chalearn.org
The competition protocol was designed by Isabelle Guyon.
The starting kit was adapted from a Jupyter notebook designed by Balazs Kegl for the RAMP platform.
This challenge was generated using Chalab, a competition wizard designed by Laurent Senta.

Evaluation

The problem is a binary classification problem. Each sample (image) is characterized by 64*64*3 = 12288 features. You must predict whether each image is fake (0) or real (1).
You are given for training a data matrix X_train of dimension 65856 x 12288 and an array y_train of 65856 labels. You must train a model that predicts the labels for two test matrices, X_valid and X_test.
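Since each row of X_train stores one image as a flat vector of 64*64*3 = 12288 values, you can recover image tensors with a single reshape. The row-major (64, 64, 3) layout assumed below is the usual convention, but check README.ipynb for the actual storage order.

    # Reshape flat rows back into image tensors; the (64, 64, 3) layout is an
    # assumption -- verify the storage order against README.ipynb.
    import numpy as np

    X_train = np.random.rand(10, 64 * 64 * 3)  # small stand-in for the real 65856-row matrix
    images = X_train.reshape(-1, 64, 64, 3)    # one (64, 64, 3) image per row
    print(images.shape)                        # (10, 64, 64, 3)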
There are 2 phases:

  • Phase 1: development phase. We provide you with labeled training data and unlabeled validation and test data. Make predictions for both datasets. However, you will receive feedback on your performance on the validation set only. The performance of your LAST submission will be displayed on the leaderboard.
  • Phase 2: final phase. You do not need to do anything. Your last submission of phase 1 will be automatically forwarded. Your performance on the test set will appear on the leaderboard when the organizers finish checking the submissions.

This sample competition allows you to submit only prediction results (no code).

Submissions are evaluated using the AUC metric. To learn more about this metric, see https://scikit-learn.org/stable/modules/generated/sklearn.metrics.roc_auc_score.html
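For local validation you can compute the same metric with scikit-learn, for example on a held-out split of the training data:

    # Compute AUC locally with scikit-learn (example values from the sklearn docs).
    from sklearn.metrics import roc_auc_score

    y_true = [0, 0, 1, 1]            # 0 = fake, 1 = real
    y_score = [0.1, 0.4, 0.35, 0.8]  # predicted probability of "real"
    print(roc_auc_score(y_true, y_score))  # 0.75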

Rules

Submissions must be made before the end of phase 1. You may submit 5 submissions every day and 100 in total.

This challenge is governed by the general ChaLearn contest rules.

Development Phase

Start: Nov. 15, 2018, midnight

Description: Development phase: tune your models and submit prediction results, trained model, or untrained model.

Final Phase

Start: April 30, 2019, midnight

Description: Final phase (no submission, your last submission from the previous phase is automatically forwarded).

Competition Ends

Never

Leaderboard:
# Username Score
1 shangeth 0.9987
2 Zhen 0.9941
3 nGrin 0.9923