Disorientator
NOTE: Even though these patent applications have just been published in the West, they are based on Japanese applications filed with the Japan Patent Office in September 2014, so they are not exactly "fresh". Nevertheless, as the capabilities of Nintendo's next systems are still unknown, it doesn't hurt to see what they've been up to.
- Like in the past, the assignee/applicant for two of the three patent applications is not Nintendo but some of the inventors.
- The 1st patent application was published on March 3rd, while the other two were published today.
- The 1st one explains how objects/the user's hand get detected, while the other two seem to expand on the concept.
- The 3rd one is basically the same as the 2nd one - it just emphasizes that gestures/distance can trigger vibration and sound output.
- It may not mean anything, but I find it interesting that this is the second time we've seen Nintendo looking for alternatives to external inputs - remember their other patent application featuring outward-facing linear image sensors?
Summary:
- Infrared camera and distance sensor
- Gesture recognition
- Distance measuring
- One example includes projector and GPS receiver
United States Patent Application 20160063711
Filed: 07/15/2015
Published: 03/03/2016
Title:
NON-TRANSITORY STORAGE MEDIUM ENCODED WITH COMPUTER READABLE IMAGE PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS, AND IMAGE PROCESSING METHOD
Abstract:
A non-transitory storage medium encoded with a computer readable image processing program is provided. The image processing program causes a computer to perform functionality including obtaining an infrared image of a subject and determining a position of the subject included in the infrared image based on brightness and a size of a prescribed region of the subject within the infrared image.
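The abstract describes locating a subject from the brightness and size of a bright region within an infrared image. A nearby hand reflects more IR light, so it shows up as a large, bright blob. Here's a minimal sketch of that idea using a toy 2D image and a made-up threshold - not Nintendo's actual algorithm:

```python
# Sketch: locate the brightest blob in a toy infrared image and
# estimate its position (centroid) and apparent size in pixels.
# The threshold and the image are illustrative, not from the patent.

def detect_subject(ir_image, threshold=128):
    """Return (centroid_x, centroid_y, pixel_count) of bright pixels,
    or None if nothing exceeds the threshold."""
    xs, ys, count = 0, 0, 0
    for y, row in enumerate(ir_image):
        for x, value in enumerate(row):
            if value >= threshold:
                xs += x
                ys += y
                count += 1
    if count == 0:
        return None
    return (xs / count, ys / count, count)

# A hand close to the camera appears as a large bright region.
toy_image = [
    [0,   0,   0,   0],
    [0, 200, 220,   0],
    [0, 210, 230,   0],
    [0,   0,   0,   0],
]
print(detect_subject(toy_image))  # (1.5, 1.5, 4)
```

The pixel count doubles as a rough distance cue: the same hand covers more pixels (and reads brighter) the closer it gets to the side-mounted camera.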
United States Patent Application 20160073033 & United States Patent Application 20160073017
Filed: 09/01/2015
Published: 03/10/2016
Title:
ELECTRONIC APPARATUS
Abstract: (for 20160073033)
An exemplary portable electronic apparatus includes a display provided at a front surface, and an infrared camera and a distance measuring sensor which are provided at a side surface. When a user holds the portable electronic apparatus with their left hand and makes a gesture with their right hand, the portable electronic apparatus analyzes an image from the infrared camera to detect the gesture made by the user. The portable electronic apparatus displays an image corresponding to a result of detection of the gesture on the display.
Abstract: (for 20160073017)
An exemplary portable electronic apparatus includes a display provided at a front surface, an infrared camera and a distance measuring sensor which are provided at a side surface, and a vibrator. When a user holds the portable electronic apparatus with their left hand and makes a gesture with their right hand, the portable electronic apparatus analyzes an image from the infrared camera to detect the gesture made by the user. The portable electronic apparatus causes the vibrator to vibrate in accordance with a result of detection of the gesture.
As shown in FIG. 1, the portable electronic apparatus 1 includes a display 2, a touch panel 3, an infrared camera 4, a distance measuring sensor 5, an input button 6 (6A to 6D), an irradiation section 7, and a projector 8, and these components are housed in a housing 10. The housing 10 (the portable electronic apparatus 1) has a plate-like shape and has a size small enough to be held with one hand or both hands of the user.
An outer camera 9 is provided at a back surface (T6 surface) of the portable electronic apparatus 1 (FIG. 3). The outer camera 9 is typically capable of capturing an image in a direction perpendicular to the imaging direction of the infrared camera 4, that is, in a direction perpendicular to the back surface. The outer camera 9 includes a lens and an image sensor which senses visible light. The outer camera 9 captures an image of a space in a back surface direction as a color image (RGB image). A camera may be provided at the front surface in addition to the outer camera 9 at the back surface. The outer camera 9 at the back surface may not be provided, and a camera may be provided at the front surface (the surface at which the screen of the display 2 is provided).
FIG. 4 is a block diagram showing an example of the internal configuration of the portable electronic apparatus 1. As shown in FIG. 4, in addition to each section described above, the portable electronic apparatus 1 includes a vibrator 11, a microphone 12, a speaker 13, a control section 14, a communication section 15, an attitude detection section 16, a GPS receiver 17, and a geomagnetic sensor 18. In addition, the portable electronic apparatus 1 includes a battery which is not shown, and is supplied with power from the battery. These respective sections are housed in the housing 10.
FIGS. 10A to 10D are each a diagram showing an example of a process based on a gesture input using the infrared camera 4. As shown in FIG. 10A, when the user makes their right hand into rock at a predetermined position in the right side surface direction of the portable electronic apparatus 1 while holding the portable electronic apparatus 1 with their left hand, a character 21 is displayed on the display 2. As shown in FIG. 10B, when, from the state of FIG. 10A, the user moves their right hand toward the right side surface of the portable electronic apparatus 1 while their right hand is kept as rock, the character 21 displayed on the display 2 is enlarged.
Furthermore, when, from the state shown in FIG. 10B, the user makes their right hand into paper, the facial expression of the character 21 on the display 2 changes (FIG. 10C). For example, as shown in FIG. 10C, the character 21 changes to have a smile expression. Moreover, as shown in FIG. 10D, when the user moves their right hand toward the portable electronic apparatus 1 while their right hand is kept as paper, the displayed character 21 is further enlarged, and the facial expression of the character 21 changes. At that time, the vibrator 11 may operate to vibrate the portable electronic apparatus 1, and a sound may be outputted from the speaker 13.
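The behavior in FIGS. 10A to 10D boils down to mapping a recognized gesture plus the hand's distance onto display state: closer hand, bigger character; "paper", smiling face. A hypothetical sketch of that mapping (the gesture names, distance range, and scaling rule are my own assumptions, not values from the patent):

```python
# Sketch: map (gesture, hand distance in cm) to a character display state,
# mirroring FIGS. 10A-10D: a closer hand enlarges the character, and a
# "paper" gesture switches the expression. All values are illustrative.

def character_state(gesture, distance_cm, max_distance_cm=30.0):
    # Closer hand -> larger scale, clamped between 1.0x and 3.0x.
    closeness = max(0.0, min(1.0, 1.0 - distance_cm / max_distance_cm))
    scale = 1.0 + 2.0 * closeness
    expression = "smile" if gesture == "paper" else "neutral"
    return {"scale": round(scale, 2), "expression": expression}

print(character_state("rock", 30))   # {'scale': 1.0, 'expression': 'neutral'}
print(character_state("paper", 0))   # {'scale': 3.0, 'expression': 'smile'}
```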
FIGS. 12A and 12B are each a diagram showing another example of a process based on a gesture input using the infrared camera 4. In this example, a calculation question is displayed on the display 2. The user answers the question displayed on the display 2, by using their right hand. For example, when a character string 1+4=? is displayed as shown in FIG. 12A, if the user performs, by using their right hand, a gesture input (a gesture of paper) indicating 5 which is the correct answer of this question, a display indicating that the answer is correct is performed on the display 2. At that time, a sound indicating that the answer is correct may be outputted, or the vibrator 11 may be operated in a pattern indicating that the answer is correct, to vibrate the portable electronic apparatus 1.
When a character string 1+1=? is displayed as shown in FIG. 12B, if the user performs, by using their right hand, a gesture input (an input of 5 by a gesture) different from the correct answer of this question, a display indicating the answer is incorrect is performed on the display 2. At that time, a sound indicating that the answer is incorrect may be outputted, or the vibrator 11 may be operated in a pattern indicating that the answer is incorrect.
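The quiz example in FIGS. 12A and 12B is just a finger-count check: the recognized gesture is converted to a number and compared against the expected answer. A sketch under the assumption that "paper" maps to five extended fingers (the mapping is mine, not the patent's):

```python
# Sketch: check a quiz answer given by hand gesture, as in FIGS. 12A/12B.
# The gesture-to-finger-count mapping below is an assumption.

FINGER_COUNT = {"rock": 0, "paper": 5}  # illustrative mapping

def check_answer(a, b, gesture):
    """Return feedback for the question 'a + b = ?' answered by a gesture."""
    answer = FINGER_COUNT.get(gesture)
    if answer is None:
        return "unrecognized"
    return "correct" if answer == a + b else "incorrect"

print(check_answer(1, 4, "paper"))  # correct   (1 + 4 = 5)
print(check_answer(1, 1, "paper"))  # incorrect (1 + 1 != 5)
```

On a correct or incorrect result the apparatus would then pick the matching sound or vibration pattern, as the excerpt describes.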
Next, an example of a process using the distance measuring sensor 5 will be described. FIG. 13 is a diagram showing the example of the process using the distance measuring sensor 5. As shown in FIG. 13, when the user puts their right hand at a predetermined position in the right side surface direction of the portable electronic apparatus 1 while holding the portable electronic apparatus 1 with their left hand, a snake character 22 appears from the right edge of the display 2. When the user moves their right hand toward the portable electronic apparatus 1, the snake character 22 extends leftward from the right edge of the display 2. Specifically, the portable electronic apparatus 1 calculates the distance between the portable electronic apparatus 1 and the right hand of the user on the basis of information from the distance measuring sensor 5, and sets a length of the snake character 22 in accordance with the calculated distance. Then, the portable electronic apparatus 1 displays the snake character 22 on the display 2. In addition, depending on the distance, the portable electronic apparatus 1 may operate the vibrator 11 to vibrate the portable electronic apparatus 1.
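The snake example is a straightforward linear mapping from measured distance to on-screen length. A sketch with a made-up sensor range and screen width (neither is specified in the excerpt):

```python
# Sketch: set the snake character's length from the measured hand distance
# (FIG. 13): the closer the hand, the longer the snake. The sensor range
# and screen width are assumptions for illustration.

def snake_length_px(distance_cm, min_cm=5.0, max_cm=40.0, screen_width_px=320):
    # Clamp the distance to the sensor's assumed working range.
    d = max(min_cm, min(max_cm, distance_cm))
    # Linear map: max distance -> length 0, min distance -> full width.
    fraction = (max_cm - d) / (max_cm - min_cm)
    return int(fraction * screen_width_px)

print(snake_length_px(40))  # 0   (hand far away, snake stays at the edge)
print(snake_length_px(5))   # 320 (hand close, snake spans the screen)
```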
Next, an example of output using the projector 8 will be described. FIG. 14 is a diagram showing a state where an image corresponding to a gesture input performed by the user is projected onto the hand of the user by using the projector 8. As shown in the upper part of FIG. 14, for example, when the user makes their right hand into paper at a predetermined position in the right side surface direction of the portable electronic apparatus 1, the character 21 is projected onto the right hand of the user. Specifically, the portable electronic apparatus 1 analyzes an input image captured by the infrared camera 4, thereby identifying the type of the gesture input performed by the user. Next, the portable electronic apparatus 1 generates an image of the character 21 on the basis of the identified gesture input, and outputs the image of the character 21 by using the projector 8. At that time, the same (or a different) image may be outputted also to the display 2.
As shown in (A) of FIG. 22, a batter is displayed on the display 2. When the user makes a gesture of paper with their right hand, the portable electronic apparatus 1 recognizes the gesture on the basis of an image from the infrared camera 4, and projects an image of a ball onto the right hand of the user by using the projector 8. Next, when the user makes a gesture of swinging their right hand, the image of the ball projected on the right hand of the user disappears, and the ball appears from the right edge of the display 2 and moves in the leftward direction of the screen ((B) of FIG. 22). When the user presses the input button 6B at the timing at which the ball reaches a predetermined range of the batter, the batter hits back the ball ((C) of FIG. 22). The hit-back ball moves in the rightward direction of the screen. The user makes a gesture of rock at predetermined timing (timing at which the hit-back ball reaches the right hand) after the ball reaches the right edge of the display 2. Accordingly, an image of the ball is projected onto the right hand of the user as if the user caught the hit-back ball ((D) of FIG. 22). At that time, characters OUT indicating that the user is successful in catching the ball may be displayed on the display 2, and a sound may be outputted, or the portable electronic apparatus 1 may vibrate. The portable electronic apparatus 1 may recognize the position of the right hand of the user (a position on a plane parallel to the right side surface of the portable electronic apparatus, and/or a position in a direction perpendicular to the right side surface) on the basis of an image from the infrared camera 4 or a distance measured by the distance measuring sensor 5, and may project an image of the ball onto the right hand of the user in accordance with a result of the recognition. 
When the user does not make a gesture of rock at the above-described predetermined timing or the timing at which the gesture of rock is made deviates from the above-described predetermined timing, the user fails in catching the ball, and the image of the ball as shown in (D) of FIG. 22 is not projected onto the right hand. For example, an image different from the ball may be projected onto the right hand, or characters ERROR or HIT may be displayed on the display 2. When a gesture made by the user cannot be recognized with the infrared camera 4, the same process as described above may be performed by detecting movement of the hand with the distance measuring sensor 5.
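The catch judgment in the FIG. 22 mini-game reduces to a timing-window check: did the "rock" gesture arrive close enough to the moment the hit-back ball reaches the hand? A sketch with an assumed window width (the patent doesn't give one):

```python
# Sketch: judge the catch in the FIG. 22 mini-game by testing whether the
# "rock" gesture falls within a timing window around the ball's arrival
# at the hand. The window width is an assumption for illustration.

def judge_catch(gesture_time_s, ball_arrival_s, window_s=0.15):
    """Return 'OUT' (caught) if the rock gesture lands inside the window,
    otherwise 'ERROR' (missed), matching the on-screen text in the excerpt."""
    if abs(gesture_time_s - ball_arrival_s) <= window_s:
        return "OUT"
    return "ERROR"

print(judge_catch(2.10, 2.00))  # OUT   (0.10 s off, inside the window)
print(judge_catch(2.50, 2.00))  # ERROR (too late)
```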