Change sensitivity of screenshots classified as "concerning"

  • Problem
  • Updated 7 months ago
Dear Support,
I am trying to set up and use Covenant Eyes on a computer and iOS device. Having used it for a day, I am noticing two serious problems.

First, it seems as though the Screen Accountability feature does not have different sensitivity levels. I presume it is, by default, set to look out only for nudity and things like that, but what if I want it to report when pictures of women in bikinis are viewed? As it happens, one such photo was caught by the software, but only by accident, and it was filed in the "Other Screenshots" category. There really should be a way to adjust this.

Second, there should at least be a way to have it report Google searches, rated according to their maturity level. Again, the image above was viewed after a search was conducted using the word "nudity." However, the program did not report on this, and it was only by accident that I could view it, since it just happened to be one of the screenshots placed in the "Other Screenshots" category. Under the description, it showed the text of the Google search containing that word, but did not flag anything.
From what I understand, there was a feature in Internet Accountability that addressed the second of these problems. Unfortunately, I have one of the new accounts, which does not seem to support this feature set. Are there any solutions to one or both of these problems?

Thank you,
Jesse


Robert B, Official Rep


Thanks for asking!

Let me give an over-simplified technical explanation. Every image receives a score, and that score is what classifies the image as "concerning" or "not concerning." In this case, it sounds like the program did exactly what we would expect it to do. It determined that the image was not a concerning screenshot, but noted that its score was higher than those of the other non-concerning images, so it was included in the Other Screenshots section. That is the current answer to the first problem you mentioned.
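To make the over-simplified explanation a little more concrete, here is a rough sketch of that kind of two-tier scoring in Python. The thresholds, score scale, and function name here are purely illustrative assumptions, not Covenant Eyes' actual values or implementation:

```python
# Illustrative sketch of score-based screenshot triage.
# The thresholds and 0.0-1.0 score scale are hypothetical,
# not the real values used by the product.

CONCERNING_THRESHOLD = 0.8  # at or above: flagged as "concerning" (assumed)
OTHER_THRESHOLD = 0.5       # below concerning but notable: "Other Screenshots" (assumed)

def classify(score: float) -> str:
    """Bucket a screenshot by its image score."""
    if score >= CONCERNING_THRESHOLD:
        return "Concerning"
    if score >= OTHER_THRESHOLD:
        # Not concerning, but scored higher than typical
        # non-concerning images, so it is surfaced separately.
        return "Other Screenshots"
    return "Not Concerning"

print(classify(0.9))  # Concerning
print(classify(0.6))  # Other Screenshots
print(classify(0.1))  # Not Concerning
```

In this sketch, the bikini photo would be an image scoring between the two thresholds: not high enough to be "concerning," but high enough to land in "Other Screenshots."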

Others have mentioned the idea of detecting material that, while not pornographic, is troublesome. We get it. But, as I'm sure you can understand, practical constraints have forced us to focus our development of the program. If and when we take that next step of reporting suggestive but non-pornographic material, it will take work by our Development team and our Reports team, and then our Marketing team will get the word out.

The second item you mentioned is similar, in a way, to the first. A version of that idea did exist before Screen Accountability™ came into being. Interestingly, there hasn't been much feedback about reporting Google search terms. I have passed along both of your requests, because all feedback is important, and our decision makers are paying attention to member feedback.

Thanks for posting,