US20100153381A1 - Automatic Mail Rejection Feature - Google Patents

Automatic Mail Rejection Feature

Info

Publication number
US20100153381A1
Authority
US
United States
Prior art keywords
spam
domain
message
emails
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/708,681
Inventor
Scott C. Harris
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harris Technology LLC
Original Assignee
Harris Technology LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harris Technology LLC filed Critical Harris Technology LLC
Priority to US12/708,681
Publication of US20100153381A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/10 Office automation; Time management
    • G06Q 10/107 Computer-aided management of electronic mailing [e-mailing]

Abstract

A spam defining system defines rules about emails depending on a user's reactions to them. A user can delete an email as spam, as not spam, or without committing to whether the email is spam or not. If the user indicates whether the email is spam or not spam, characteristics of the email are used to update a database. Incoming emails are compared against the database to determine a likelihood that they are spam.

Description

  • This application is a divisional of Ser. No. 09/690,002, filed Oct. 16, 2000; which claims priority from provisional application No. 60/203,729, filed May 12, 2000, now lapsed.
  • BACKGROUND
  • This invention relates to an automatic mail rejection feature in an e-mail program.
  • E-mail can be an inexpensive and effective way of sending information. Because of this, a recurrent problem is “spam”, or the sending of unwanted email to a certain person. Once an e-mail address gets on a spammer's list, the person can be barraged with junk email. Various attempts have been made to combat this problem.
  • For example, some web e-mail programs include the ability to block further mail from a specified sender. When junk mail is received from a specified address, the control is actuated. Further mail from that specified sender is then blocked, presumably automatically deleted or sent to the trash.
  • Certain laws also cover spamming, and require that each e-mail that is sent unsolicited have a way of unsubscribing from the list. Spammers combat both of these measures by continually changing their name and/or changing their return address.
  • Some e-mail programs allow a user to manually set criteria for rejection of incoming email. For example, if an incoming e-mail is from a domain that has many known spammers, many people may simply set their program to delete it. However, this has the unintended extra effect of also removing desired email, at times.
  • In addition, the automatic rejection feature does nothing to resolve the traffic caused by junk e-mail.
  • SUMMARY
  • The present application teaches an automatic system which automatically recognizes certain aspects of undesired messages such as junk email and undesired Internet content. The system automatically produces recommendations of criteria to use in automatically removing undesired information.
  • In an email embodiment described herein, these criteria can be automatically enforced or can be presented to the user as a table of options. In addition, the system can look for keywords in the e-mail, and can automatically postulate strategies for rules based on these keywords.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other aspects will be described in detail with reference to the accompanying drawings, wherein:
  • FIG. 1 shows an email browser window;
  • FIG. 2 shows a determined spam message, and the parsing scheme used on it;
  • FIG. 3 shows an exemplary computer system; and
  • FIG. 4 shows an operational flowchart.
  • DESCRIPTION OF THE PREFERRED EMBODIMENT
  • A first embodiment describes an e-mail program which allows automatic rejection of unwanted messages. The embodiment runs on a computer shown in FIG. 3, having a processor 300 and memory 305. A typical e-mail browser window is shown in FIG. 1. The browser window includes a number of operating buttons 102, a list of return addresses, and message subjects. This browser also displays a measure called the likelihood of spam quotient, or "LOSQ". The likelihood of spam quotient is displayed in the rightmost column as a percentage. For example, a message that is known to be spam would have a likelihood of spam quotient of 100%. Other messages that are less likely to be spam may have a likelihood of spam quotient of something less than 100%.
  • The likelihood of spam quotient can be displayed as a number as shown in FIG. 1, or alternately can be indicated by the color in which the message is displayed. For example, the message can be displayed in green to indicate a low likelihood of spam (e.g., less than 10%), in yellow to indicate a medium likelihood of spam (e.g., between 10 and 80 percent), and in red to indicate a high likelihood of spam (e.g., 80 to 100 percent).
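  • As a concrete illustration of the display just described, the short sketch below (not part of the patent; the function name and thresholds are chosen here only to mirror the examples above) maps an LOSQ percentage to the green/yellow/red indication.

```python
def losq_color(losq_percent: float) -> str:
    """Map a likelihood-of-spam quotient (0-100) to a display color.

    The cut-offs mirror the examples in the text: below 10% is low
    likelihood (green), 10-80% is medium (yellow), above 80% is high (red).
    """
    if losq_percent < 10:
        return "green"
    if losq_percent <= 80:
        return "yellow"
    return "red"


if __name__ == "__main__":
    for quotient in (3, 45, 95):
        print(quotient, losq_color(quotient))
```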
  • One of the buttons 106 on the toolbar requests removal of the high spam likelihood messages from the inbox. This enables, in a single click, removing all high likelihood of spam messages. Another button 120 is an options button which brings up the options menu of FIG. 2.
  • The function buttons in FIG. 1 include, as conventional, a delete message button 107. An additional "delete as spam" button 111 is also provided. Any message that is deleted as being spam is further processed to determine characteristics that can be used to process other messages. Characteristics of the deleted-as-spam message are used to update the rules database to indicate characteristics of the spamming messages.
  • Another button 112 is also provided indicating “delete the message; not spam”. Therefore, the user is presented with three different options: delete the message without indicating whether it is spam or not, delete the message while indicating that it is spam, or delete the message indicating that it is not spam.
  • The latter two options are used to update the rules in the rules database as described in further detail herein. Hence, this option allows adding an incoming e-mail message to the spam list, when it is determined to be likely to be spam.
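  • A minimal sketch of how the three delete actions could feed such a rules database is shown below; the class, method, and field names are illustrative assumptions, not the patent's own implementation.

```python
class RulesDatabase:
    """Toy rules store keyed by message characteristics (illustrative only)."""

    def __init__(self):
        self.spam_characteristics = []      # from messages deleted as spam
        self.not_spam_characteristics = []  # from messages deleted as "not spam"

    def record(self, message, is_spam):
        target = self.spam_characteristics if is_spam else self.not_spam_characteristics
        sender = message.get("from", "")
        target.append({
            "sender": sender,
            "domain": sender.rpartition("@")[2].lower(),
            "subject": message.get("subject", ""),
        })


def delete_message(message, rules, verdict=None):
    """verdict is 'spam', 'not_spam', or None (plain delete, no database update)."""
    if verdict == "spam":
        rules.record(message, is_spam=True)
    elif verdict == "not_spam":
        rules.record(message, is_spam=False)
    # In all three cases the message itself would be removed from the inbox here.
```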
  • FIG. 1 also shows a number of different ways of displaying different email. The first option, labeled “show all messages”, on button 104, has the function, as it suggests, of showing all messages. The messages may be further characterized based on the likelihood that they are spam. As described herein, the messages are characterized by comparing them with rules. Each match with the rules may increase the score, and make it more likely that the message is spam. More about this operation is described herein.
  • Those messages which are likely not spam are shown in a neutral color such as green or black. The messages which are questionable are shown in a cautionary color, such as yellow highlight. Finally, the messages which are likely to be spam are shown in an alert color such as red.
  • A second display option displays only those messages which are likely to represent desired messages. Hence, only the green and yellow messages are displayed. According to one embodiment, the messages are sorted by date and time received. Within each day, the messages are sorted by likelihood of being spam. The spam-likely messages, which are determined to be likely to represent spam, may be put into a separate folder; here shown as “spam-likely messages”.
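  • The ordering just described (by day received, then by likelihood of spam within each day, with spam-likely messages diverted to their own folder) could be sketched as follows; the message records and the 80% cut-off are assumptions for illustration.

```python
from datetime import datetime

messages = [
    {"subject": "Meeting notes", "received": datetime(2000, 10, 16, 9, 0), "losq": 5},
    {"subject": "FREE prizes!!!", "received": datetime(2000, 10, 16, 7, 30), "losq": 92},
    {"subject": "Lunch?", "received": datetime(2000, 10, 15, 12, 0), "losq": 12},
]

# Newest day first; within each day, least spam-likely messages first.
messages.sort(key=lambda m: (m["received"].date(), -m["losq"]), reverse=True)

# Divert high-likelihood messages to a separate "spam-likely messages" folder.
spam_likely_folder = [m for m in messages if m["losq"] >= 80]
inbox = [m for m in messages if m["losq"] < 80]
```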
  • The messages which are likely to represent undesired information can be read by the user. If not read by the user, they are kept in the folder for a specified period of time, e.g., thirty days, before being deleted.
  • The incoming messages are processed based on rules. For example, if one does not want to be on a mailing list about XXX type items, then messages that include the text “free xxx pictures” may be likely to be spam. However, other people may find those messages to be highly desirable. Similarly, messages about get rich quick schemes may be trash to one person, treasure to another.
  • The present system allows customization of which emails to remove as spam, by defining rules. Each time a message is deleted as spam, a number of aspects about that message are stored. A database is used to store the message. This database may include relative weighting of different aspects. FIG. 2 shows a determined spam message, and the parsing scheme.
  • The sender of the message is often a highly determinative factor. For example, if a specific sender sends one spam message, the same sender is very likely to be sending another spam message later on. Therefore, a first item in the database is the "received from" field 202. In addition to the specific sender, however, the domain of the sender often gives information. This domain is reviewed at 204. If the domain is a common domain such as Yahoo.com or Hotmail.com, then the sender's domain may not be probative. If, however, the domain name is uncommon, such as getrichquick.com or the like, then it is more likely that other messages from that domain would be spam. Further, receiving many messages from a common domain may itself be probative. The domain information is weighted accordingly.
  • The domain name from an item is added to the rules database from field 204. Another field 206 stores an indication of whether the domain is a common domain or an uncommon/specific domain. This determination is initially zero, and is changed depending on the number of hits for that domain in the database. For example, when two different addresses from the same domain are determined to be spam, the value presumptively becomes H (likely to be spam). When two different addresses from the same domain are received, one spam, the other not, the value presumptively becomes L.
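  • A sketch of the presumptive value stored in field 206 follows, assuming the database keeps, per domain, which distinct sender addresses were judged spam or not spam; this bookkeeping is an assumption made for illustration.

```python
from collections import defaultdict

# domain -> set of (address, was_spam) observations from user verdicts
domain_history = defaultdict(set)


def record_verdict(address, was_spam):
    domain = address.rpartition("@")[2].lower()
    domain_history[domain].add((address.lower(), was_spam))


def domain_presumption(domain):
    """Return 'H' (presumptively spam), 'L' (presumptively not spam), or '0' (no determination yet)."""
    observations = domain_history.get(domain.lower(), set())
    spam_senders = {addr for addr, was_spam in observations if was_spam}
    ham_senders = {addr for addr, was_spam in observations if not was_spam}
    if len(spam_senders) >= 2 and not ham_senders:
        return "H"   # two different addresses from the same domain were spam
    if spam_senders and ham_senders:
        return "L"   # mixed verdicts from addresses in the same domain
    return "0"       # initial value: not enough information
```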
  • Each sentence and field in the e-mail, including the subject, the text of the body, links in the email, and any others, is then stored as a separate field.
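  • One plausible way to break a message into those separately stored fields is sketched below; the regular expressions and field names are assumptions, not taken from the patent.

```python
import re


def parse_message(raw):
    """Split a raw message (dict with 'from', 'subject', 'body') into stored fields."""
    sender = raw.get("from", "")
    body = raw.get("body", "")
    return {
        "sender": sender,
        "domain": sender.rpartition("@")[2].lower(),
        "subject": raw.get("subject", ""),
        # Crude sentence split; a real parser would be more careful.
        "sentences": [s.strip() for s in re.split(r"[.!?]+", body) if s.strip()],
        # Hyperlinks found anywhere in the body.
        "links": re.findall(r"https?://\S+", body),
    }
```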
  • Analogous information may also be categorized from emails that are deleted as “not spam”. This provides a database of characteristics that are likely to represent spam messages, and other characteristics that are less likely to represent spam messages. Matching with the databases changes the scoring of the message accordingly.
  • Once the database becomes sufficiently large, it may become time-consuming to compare incoming messages with the database. Indexing approaches can be used to increase the speed of the comparison. The detailed comparison may also be done in the background; the message may be displayed, and its classification displayed only some time later.
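  • One way to realize the indexing idea is an inverted index from words to stored spam records, so that an incoming message is compared in detail only against records sharing at least one word with it. The sketch below assumes the parsed-field layout from the earlier sketch and is not a description of the patent's actual index.

```python
from collections import defaultdict


def build_index(spam_records):
    """Inverted index: word -> indices of spam records whose sentences contain that word."""
    index = defaultdict(set)
    for i, record in enumerate(spam_records):
        for sentence in record.get("sentences", []):
            for word in sentence.lower().split():
                index[word].add(i)
    return index


def candidate_records(message, index):
    """Only these record indices need the detailed (slower) comparison."""
    candidates = set()
    for sentence in message.get("sentences", []):
        for word in sentence.lower().split():
            candidates |= index.get(word, set())
    return candidates
```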
  • FIG. 4 shows incoming messages received at 400 being broken down into parts analogous to those parts that are cataloged in the database 410. Each part in the incoming e-mail is compared with each part in the database. A simplified index can be prepared, such as the type used for internet searching, in order to speed up the searching. Each match changes the scoring of the email, to make it more likely to be spam, or less likely to be spam, at 415. Each field match has a specified score increase. For example, a match on the sender's address is a very powerful indication of spam, and may by itself carry a score of 75. A 100% match of a sentence may carry a score of 10. A 50% word match may carry a score of 3. A match of the hyperlinks in an e-mail to those in an e-mail previously determined to be spam may carry a score of 5.
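  • The example weights above (75 for a sender match, 10 for a sentence that matches 100%, 3 for a 50% word match, 5 for a matching hyperlink) can be sketched as a scoring function; the matching logic is deliberately simplified and assumes the parsed fields from the earlier sketch.

```python
def word_overlap(a, b):
    """Fraction of shared words between two sentences (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)


def score_against_spam_record(message, spam):
    """Accumulate a spam score for one incoming message against one stored spam record."""
    score = 0
    if message["sender"] and message["sender"] == spam["sender"]:
        score += 75                      # sender match: very strong indication of spam
    for sent in message["sentences"]:
        for spam_sent in spam["sentences"]:
            if sent.lower() == spam_sent.lower():
                score += 10              # sentence matches 100%
            elif word_overlap(sent, spam_sent) >= 0.5:
                score += 3               # roughly a 50% word match
    for link in message["links"]:
        if link in spam["links"]:
            score += 5                   # hyperlink also seen in an earlier spam message
    return score
```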
  • Similarly, the e-mail and its fields can be compared with non-spam indicative email. An e-mail which is not spam can carry negative scores, for example. Finding the e-mail address to be on the non-spam list, for example, can carry a score of negative 100, or can immediately abort the process with an indication of non-spam.
  • If a message has few matches to the database, it may be characterized as unknown or cautionary (yellow). Similarly, mixed signals (some matches to spam and non-spam database), may result in an unknown result.
  • The total score for an e-mail is assessed, and this total score is used to assess if the e-mail is spam or not. If the e-mail is determined to be spam, then it is appropriately processed.
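  • Tying the steps together, a final classification pass might look like the sketch below. The short-circuit for senders on the non-spam list and the alternative negative-100 credit come from the text; the numeric thresholds that separate green, yellow, and red are purely illustrative assumptions.

```python
def classify(message, total_score, not_spam_senders):
    """Return 'not_spam' (green), 'unknown' (yellow), or 'spam' (red) for a scored message."""
    if message["sender"] in not_spam_senders:
        # The text allows either aborting with a non-spam indication (done here)
        # or subtracting 100 from the score.
        return "not_spam"
    if total_score >= 80:
        return "spam"        # displayed in red; candidate for the spam-likely folder
    if total_score >= 10:
        return "unknown"     # cautionary / yellow: few or mixed matches
    return "not_spam"        # green
```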
  • Many different rules databases can be used.
  • Such modifications are intended to be encompassed.

Claims (16)

1. A method, comprising:
using a computer for obtaining a message;
monitoring a user's actions on said computer with respect to said message;
using the user's actions for automatically forming rules indicating the desirability of said message;
analyzing parts of said message, where said parts include at least one hyperlink within said message, and at least information within said message indicative of an internet address within said message; and
said computer using said rules to assess desirability of other messages.
2. A method as in claim 1, wherein said other messages are electronic mail messages.
3. A method as in claim 1 wherein said user's actions include a specific way that the user deletes said message using the computer.
4. A method as in claim 1 wherein said desirability comprises whether said message is a spam e-mail message.
5. A method, comprising:
using a computer for determining a plurality of characteristics of an unwanted electronic message;
using the computer for forming a list with said plurality of characteristics;
using the computer for receiving an incoming electronic message different than said unwanted electronic message, and forming a score of the incoming message by comparing said incoming message with said list and determining commonalities between said incoming message and said list, wherein said comparing comprises determining a domain of the sender, and comparing said domain of the sender with information about spam messages in the list, to obtain a higher probability of spam when information about all senders from a specific domain in said database represent spam, and to represent a lower probability of spam when some senders from said domain in said database represent spam and other senders from said domain in said database do not represent spam;
using the computer for defining said incoming message as likely being spam if said score is within a predetermined range; and
using the computer for taking an action to restrict said message based on said defining.
6. A method as in claim 5, wherein said comparing also comprises determining hyperlinks within said electronic message, and comprises comparing said hyperlinks with hyperlinks within said list.
7. A computer product, comprising a processor and memory, and executable instructions that are adapted to be executed to implement a filter for an e-mail program, said product comprising:
an email receiving part that receives information indicative of emails that have been received and determines domain information from a sender of one of said emails;
a storage part that stores information indicative of emails that are known to represent spam, said storage part storing domain information for multiple of said emails that are known to be spam; and
a comparing part, that compares said received emails to said known spam emails, and determines that a received email represents spam responsive to said domain information in said storage part matching to said domain information from said email receiving part.
8. A product as in claim 7, wherein said comparing part determines a received email as being more likely to represent spam when said domain information matches to an uncommon domain, and as being less likely to represent spam when said domain information matches to a more common domain.
9. A product as in claim 8, where said comparing part determines a number of hits to the domain to determine whether the domain is a common domain.
10. A product as in claim 7, wherein said comparing part also determines hyperlinks within said electronic message, and compares said hyperlinks with hyperlinks within said list.
11. A program as in claim 7, further comprising a display output which displays a likelihood of spam coefficient which indicates a numerical percentage likelihood that a specific email message represents spam.
12. A computer product, comprising a processor and memory, and executable instructions that are adapted to be executed to implement a filter for an e-mail program, said product comprising:
an email receiving part that receives information indicative of emails that have been received and determines information about said emails, one part of said information including information about hyperlinks within said emails that have been received;
a storage part that stores information indicative of emails that are known to represent spam, said storage part storing information about hyperlinks that are within at least one of said emails that are known to be spam; and
a comparing part, that compares said received emails to said known spam emails, and determines that a received email represents spam responsive to said hyperlinks in said storage part matching to said hyperlinks from said email receiving part.
13. A product as in claim 12, wherein said comparing part also determines that a received email represents spam responsive to said domain information in said storage part matching to said domain information from said email receiving part.
14. A product as in claim 13, wherein said comparing part determines a received email as being more likely to represent spam when said domain information matches to an uncommon domain, and as being less likely to represent spam when said domain information matches to a more common domain.
15. A product as in claim 13, where said comparing part determines a number of hits to the domain to determine whether the domain is a common domain.
16. A program as in claim 12, further comprising a display output which displays a likelihood of spam coefficient which indicates a numerical percentage likelihood that a specific email message represents spam.
US12/708,681 2000-05-12 2010-02-19 Automatic Mail Rejection Feature Abandoned US20100153381A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/708,681 US20100153381A1 (en) 2000-05-12 2010-02-19 Automatic Mail Rejection Feature

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US20372900P 2000-05-12 2000-05-12
US09/690,002 US7707252B1 (en) 2000-05-12 2000-10-16 Automatic mail rejection feature
US12/708,681 US20100153381A1 (en) 2000-05-12 2010-02-19 Automatic Mail Rejection Feature

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/690,002 Division US7707252B1 (en) 2000-05-12 2000-10-16 Automatic mail rejection feature

Publications (1)

Publication Number Publication Date
US20100153381A1 (en) 2010-06-17

Family

ID=42112580

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/690,002 Expired - Fee Related US7707252B1 (en) 2000-05-12 2000-10-16 Automatic mail rejection feature
US12/708,681 Abandoned US20100153381A1 (en) 2000-05-12 2010-02-19 Automatic Mail Rejection Feature

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/690,002 Expired - Fee Related US7707252B1 (en) 2000-05-12 2000-10-16 Automatic mail rejection feature

Country Status (1)

Country Link
US (2) US7707252B1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7490128B1 (en) * 2002-09-09 2009-02-10 Engate Technology Corporation Unsolicited message rejecting communications processor
US7546348B2 (en) * 2003-05-05 2009-06-09 Sonicwall, Inc. Message handling with selective user participation
US7620690B1 (en) * 2003-11-20 2009-11-17 Lashback, LLC Privacy control system for electronic communication
US8805996B1 (en) * 2009-02-23 2014-08-12 Symantec Corporation Analysis of communications in social networks
US8959157B2 (en) * 2009-06-26 2015-02-17 Microsoft Corporation Real-time spam look-up system
KR102599102B1 (en) * 2016-11-11 2023-11-06 엘지전자 주식회사 Laundry Treating Apparatus
US11050698B1 (en) * 2020-09-18 2021-06-29 Area 1 Security, Inc. Message processing system with business email compromise detection

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5826022A (en) 1996-04-05 1998-10-20 Sun Microsystems, Inc. Method and apparatus for receiving electronic mail
US5742769A (en) 1996-05-06 1998-04-21 Banyan Systems, Inc. Directory with options for access to and display of email addresses
US5835722A (en) 1996-06-27 1998-11-10 Logon Data Corporation System to control content and prohibit certain interactive attempts by a person using a personal computer
US5930479A (en) 1996-10-21 1999-07-27 At&T Corp Communications addressing system
JPH1115756A (en) * 1997-06-24 1999-01-22 Omron Corp Electronic mail discrimination method, device, therefor and storage medium
US6249805B1 (en) * 1997-08-12 2001-06-19 Micron Electronics, Inc. Method and system for filtering unauthorized electronic mail messages
US6393465B2 (en) * 1997-11-25 2002-05-21 Nixmail Corporation Junk electronic mail detector and eliminator
US6023723A (en) * 1997-12-22 2000-02-08 Accepted Marketing, Inc. Method and system for filtering unwanted junk e-mail utilizing a plurality of filtering mechanisms
US6167434A (en) * 1998-07-15 2000-12-26 Pang; Stephen Y. Computer code for removing junk e-mail messages
US6874023B1 (en) * 1998-11-10 2005-03-29 Claria Corporation Web based email control center for monitoring and providing a summary of the detected event information organized according to relationships between the user and network sites
US20030097361A1 (en) * 1998-12-07 2003-05-22 Dinh Truong T Message center based desktop systems
US8533038B2 (en) * 1999-05-21 2013-09-10 International Business Machines Corporation Offer delivery system
US6701346B1 (en) * 1999-07-12 2004-03-02 Micron Technology, Inc. Managing redundant electronic messages
US6707472B1 (en) * 1999-10-18 2004-03-16 Thomas Grauman Method of graphically formatting e-mail message headers
US7162437B2 (en) * 2000-01-06 2007-01-09 Drugstore.Com, Inc. Method and apparatus for improving on-line purchasing
US6725228B1 (en) * 2000-10-31 2004-04-20 David Morley Clark System for managing and organizing stored electronic messages
EP1360597A4 (en) * 2001-02-15 2005-09-28 Suffix Mail Inc E-mail messaging system
US6769016B2 (en) * 2001-07-26 2004-07-27 Networks Associates Technology, Inc. Intelligent SPAM detection system using an updateable neural analysis engine

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6453327B1 (en) * 1996-06-10 2002-09-17 Sun Microsystems, Inc. Method and apparatus for identifying and discarding junk electronic mail
US20030135555A1 (en) * 1997-06-16 2003-07-17 Digital Equipment Corporation Web-Based Electronic Mail Server Apparatus and Method Using Full Text and Label Indexing
US7117358B2 (en) * 1997-07-24 2006-10-03 Tumbleweed Communications Corp. Method and system for filtering communication
US6421709B1 (en) * 1997-12-22 2002-07-16 Accepted Marketing, Inc. E-mail filter and method thereof
US5999932A (en) * 1998-01-13 1999-12-07 Bright Light Technologies, Inc. System and method for filtering unsolicited electronic mail messages using data matching and heuristic processing
US6161130A (en) * 1998-06-23 2000-12-12 Microsoft Corporation Technique which utilizes a probabilistic classifier to detect "junk" e-mail by automatically updating a training and re-training the classifier based on the updated training set
US6493007B1 (en) * 1998-07-15 2002-12-10 Stephen Y. Pang Method and device for removing junk e-mail messages
US6615242B1 (en) * 1998-12-28 2003-09-02 At&T Corp. Automatic uniform resource locator-based message filter
US6311190B1 (en) * 1999-02-02 2001-10-30 Harris Interactive Inc. System for conducting surveys in different languages over a network with survey voter registration
US6732149B1 (en) * 1999-04-09 2004-05-04 International Business Machines Corporation System and method for hindering undesired transmission or receipt of electronic messages
US6507866B1 (en) * 1999-07-19 2003-01-14 At&T Wireless Services, Inc. E-mail usage pattern detection
US7421472B1 (en) * 1999-11-19 2008-09-02 Ross Jr Robert C System, method, and computer program product for providing a multi-user e-mail system
US6601066B1 (en) * 1999-12-17 2003-07-29 General Electric Company Method and system for verifying hyperlinks
US6779021B1 (en) * 2000-07-28 2004-08-17 International Business Machines Corporation Method and system for predicting and managing undesirable electronic mail
US20060059238A1 (en) * 2004-05-29 2006-03-16 Slater Charles S Monitoring the flow of messages received at a server

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050204006A1 (en) * 2004-03-12 2005-09-15 Purcell Sean E. Message junk rating interface
US20050223074A1 (en) * 2004-03-31 2005-10-06 Morris Robert P System and method for providing user selectable electronic message action choices and processing
US20090034527A1 (en) * 2005-04-13 2009-02-05 Bertrand Mathieu Method of combating the sending of unsolicited voice information
US10333974B2 (en) * 2017-08-03 2019-06-25 Bank Of America Corporation Automated processing of suspicious emails submitted for review
CN108763449A (en) * 2018-05-28 2018-11-06 华南理工大学 A kind of Chinese key rule generating method of Spam filtering

Also Published As

Publication number Publication date
US7707252B1 (en) 2010-04-27

Similar Documents

Publication Publication Date Title
US20100153381A1 (en) Automatic Mail Rejection Feature
US10284506B2 (en) Displaying conversations in a conversation-based email system
US7895279B2 (en) Threaded presentation of electronic mail
US9734216B2 (en) Systems and methods for re-ranking displayed conversations
US9602456B2 (en) Systems and methods for applying user actions to conversation messages
US7707261B1 (en) Identification and filtration of digital communications
US6925605B2 (en) Collating table for email
US9699129B1 (en) System and method for increasing email productivity
US20030195937A1 (en) Intelligent message screening
KR20060136476A (en) Displaying conversations in a conversation-based email system
JP2000172586A (en) Mail sorting method and system therefor and recording medium

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION