NSFW Classification Basics
How NSFW Classification works
Based on a large dataset (owo), CoffeeHouse can classify an image to determine whether or not it contains pornographic content. This can be used in your software and services to filter out content that may be inappropriate. The prediction values of the NSFW image classification model are accurate enough for general purposes. Currently the API only accepts JPEG and PNG as image formats; more formats may be accepted in the future.

API Method
Method Name | Access URI | Methods | Description |
---|---|---|---|
nsfw_classification | coffeehouse/v1/image/nsfw_classification | GET, POST | Classifies the given image input |
API Authentication
CoffeeHouse's API is powered by Intellivoid API. This uses the standard "API Key"
authentication method which is the same across all other Intellivoid APIs.
Hint: Your Access Key is your API Key
The following authentication methods are available:

Basic HTTP Authentication
You can authenticate by providing your access key via the password field; the username can be left blank, and you should not provide your access key via the username.

GET/POST Parameter
You can provide your access key via a GET parameter or within a POST (multipart/form-data) request; the parameter name applicable to both methods is access_key.
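As a minimal sketch, this is how the access_key parameter could be supplied with Python's requests library; the full endpoint URL is assembled from the Access URI in the table above, and YOUR_ACCESS_KEY is a placeholder:

```python
import requests

# Sketch: authenticate via the access_key POST parameter.
# "YOUR_ACCESS_KEY" is a placeholder for your Intellivoid Access Key.
response = requests.post(
    "https://api.intellivoid.net/coffeehouse/v1/image/nsfw_classification",
    data={"access_key": "YOUR_ACCESS_KEY"},
)
print(response.status_code)
```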
Security Notice
All official APIs by Intellivoid are served through api.intellivoid.net, followed by the service name and the version of the API Handler, for example api.intellivoid.net/coffeehouse/v1. If authentication fails, a response like the following is returned:
```json
{
    "success": false,
    "response_code": 401,
    "error": {
        "error_code": 0,
        "type": "CLIENT",
        "message": "Unauthorized Access, Authentication is required"
    }
}
```
Additionally, a WWW-Authenticate: Basic realm="API Authentication" header will be provided in the response. If you are using a web browser you will be prompted to authenticate; you can authenticate by providing your Access Key in the password field as explained above.
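A minimal sketch of Basic HTTP authentication with Python's requests, leaving the username blank as described above (the endpoint and key are placeholders):

```python
import requests

# Sketch: Basic HTTP authentication with a blank username and the
# Access Key in the password field.
response = requests.post(
    "https://api.intellivoid.net/coffeehouse/v1/image/nsfw_classification",
    auth=("", "YOUR_ACCESS_KEY"),
)
print(response.status_code)
```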
Troubleshooting
All API responses return an X-Request-ID header with a unique value set on every response you get. It is important to log this request ID if you encounter unexpected issues or server-side errors; Intellivoid can use this ID to get more details about the request and troubleshoot the error.
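For example, a sketch of recording the header with Python's requests and the standard logging module (the endpoint and key are placeholders, as before):

```python
import logging
import requests

logging.basicConfig(level=logging.INFO)

response = requests.post(
    "https://api.intellivoid.net/coffeehouse/v1/image/nsfw_classification",
    data={"access_key": "YOUR_ACCESS_KEY"},
)
# Keep the X-Request-ID so Intellivoid can trace the request server-side.
logging.info("X-Request-ID: %s", response.headers.get("X-Request-ID"))
```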
Data protection
To protect your data, we do not store the information our server returns to your client. The only data we keep track of is the request data you sent and server-side details that are not visible in the request, such as exception dumps, cache information and so on. This information is automatically deleted from our server after two weeks; it is used to troubleshoot any problems with our services and to address them accordingly.

Classifying an image
There are two ways to upload an image to classify: either the traditional method of uploading a file via the image parameter, or base64-encoding the image contents yourself and passing them via the same image parameter. Both methods are sketched in the example after the parameter table below.
This method will return a response with the NsfwClassification object and an optional Generalization object upon success.
Parameter Name | Default Value | Required | Description |
---|---|---|---|
image | NULL | True | Base64-encoded contents of the image / form-data file post |
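A sketch of both upload methods with Python's requests; the file name, endpoint URL and access key are placeholders:

```python
import base64
import requests

API_URL = "https://api.intellivoid.net/coffeehouse/v1/image/nsfw_classification"
ACCESS_KEY = "YOUR_ACCESS_KEY"  # placeholder

# Method 1: traditional multipart/form-data file upload via the image parameter.
with open("photo.png", "rb") as f:
    upload = requests.post(
        API_URL,
        data={"access_key": ACCESS_KEY},
        files={"image": f},
    )

# Method 2: base64-encode the image contents yourself and pass them as image.
with open("photo.png", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("ascii")
b64_upload = requests.post(
    API_URL,
    data={"access_key": ACCESS_KEY, "image": encoded},
)

print(upload.json())
print(b64_upload.json())
```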
Example Success Response
```json
{
    "success": true,
    "response_code": 200,
    "results": {
        "nsfw_classification": {
            "content_hash": "4816a1048c3c1669fdded70efb713981634eff8e42c57caeaf4b145cb09441e8",
            "content_type": "png",
            "safe_prediction": 99.425405,
            "unsafe_prediction": 0.57458887,
            "is_nsfw": false
        },
        "generalization": null
    }
}
```
Response Object Structure
Name | Type | Description |
---|---|---|
nsfw_classification | NsfwClassification | The results of the NSFW classification process |
generalization | Generalization\|null | The generalization results if Generalization is enabled, otherwise returned as null |
NsfwClassification Object Structure
Name | Type | Description |
---|---|---|
content_hash | string | A SHA-256 hash of the image contents |
content_type | string | The content type detected by the API, either PNG or JPG |
safe_prediction | float | The safe prediction value for the image content |
unsafe_prediction | float | The unsafe prediction value for the image content |
is_nsfw | bool | Indicates whether the image is NSFW (unsafe); true when unsafe_prediction is higher than safe_prediction |
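As a sanity check, here is a short Python sketch tying these fields together, using the values from the example success response above:

```python
# Fields copied from the example success response above.
classification = {
    "content_hash": "4816a1048c3c1669fdded70efb713981634eff8e42c57caeaf4b145cb09441e8",
    "content_type": "png",
    "safe_prediction": 99.425405,
    "unsafe_prediction": 0.57458887,
    "is_nsfw": False,
}

# is_nsfw is true exactly when unsafe_prediction exceeds safe_prediction.
assert classification["is_nsfw"] == (
    classification["unsafe_prediction"] > classification["safe_prediction"]
)
print("Safe: %.2f%%, unsafe: %.2f%%" % (
    classification["safe_prediction"],
    classification["unsafe_prediction"],
))
```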
Invalid File Type Response
This response is given when the uploaded file is not supported by the API
```json
{
    "success": false,
    "response_code": 400,
    "error": {
        "error_code": 12,
        "type": "CLIENT",
        "message": "The file type isn't supported for this method"
    }
}
```
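A sketch of handling this error in client code; the file name is hypothetical, and error_code 12 comes from the response above:

```python
import requests

# Sketch: detect the unsupported-file-type error (error_code 12).
with open("animation.gif", "rb") as f:  # GIF is not an accepted format
    response = requests.post(
        "https://api.intellivoid.net/coffeehouse/v1/image/nsfw_classification",
        data={"access_key": "YOUR_ACCESS_KEY"},
        files={"image": f},
    )
body = response.json()
if not body["success"] and body["error"]["error_code"] == 12:
    # Only JPEG and PNG are currently accepted; convert the image and retry.
    print("Unsupported file type:", body["error"]["message"])
```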
Generalization
This method supports generalization of its results; for more information on how generalization works, see Generalization - Introduction. This method will use the following labels for generalization:
- safe
- unsafe