Kwadwo Agyapon-Ntra committed
Commit · fcae6d4 · 1 Parent(s): 28ff501

Modified description
Files changed:
- app.py (+1, -0)
- test.ipynb (+1, -0)
app.py CHANGED
@@ -56,6 +56,7 @@ description = "**The internet is not safe for children**. Even if we know the 'b
 "This is step one in an attempt to solve that. An image classifier that audits every image at a URL. \n"+\
 "In this iteration, I classify sites with sexually explicit content as **'NOT safe'**. \n\n"+\
 "There is a long way to go with NLP for profanity, cyber-bullying, as well as CV for violence, substance abuse, etc. \n"+\
+"Another step will be to convert this into a browser extension/add-on. \n"+\
 "I welcome any help on this. π"
 examples = ['porhub.com', 'cnn.com', 'xvideos.com', 'www.pinterest.com']
 enable_queue=True
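For context, the `description` string edited here, together with `examples` and `enable_queue`, is presumably passed to a Gradio interface elsewhere in app.py, which this diff does not show. Below is a minimal sketch of that wiring, assuming a standard gr.Interface setup; `classify_url` is a hypothetical stand-in for the Space's real prediction function.

# Minimal sketch (not the Space's actual code) of how the edited variables
# are typically consumed by a Gradio Interface.
import gradio as gr

description = "**The internet is not safe for children**. ..."  # full text as edited in the diff above
examples = ['porhub.com', 'cnn.com', 'xvideos.com', 'www.pinterest.com']
enable_queue = True

def classify_url(url: str) -> str:
    # Placeholder logic only; the real app fetches every image at `url`
    # and runs an image classifier over each one.
    return "NOT safe (placeholder verdict for " + url + ")"

demo = gr.Interface(
    fn=classify_url,
    inputs=gr.Textbox(label="Site URL"),
    outputs=gr.Textbox(label="Verdict"),
    description=description,   # the multi-line string this commit extends
    examples=examples,
)

# Older Gradio releases accepted enable_queue as a launch() flag;
# newer ones expect demo.queue() before launch() instead.
demo.launch(enable_queue=enable_queue)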
test.ipynb CHANGED
@@ -110,6 +110,7 @@
 " \"This is step one in an attempt to solve that. An image classifier that audits every image at a URL. \\n\"+\\\n",
 " \"In this iteration, I classify sites with sexually explicit content as **'NOT safe'**. \\n\\n\"+\\\n",
 " \"There is a long way to go with NLP for profanity, cyber-bullying, as well as CV for violence, substance abuse, etc. \\n\"+\\\n",
+" \"Another step will be to convert this into a browser extension/add-on. \\n\"+\\\n",
 " \"I welcome any help on this. π\"\n",
 "examples = ['porhub.com', 'cnn.com', 'xvideos.com', 'www.pinterest.com']\n",
 "enable_queue=True\n",