
Researchers Create Screens with Pixels That Can Be Touched and Felt

Hello HaWkers, a fascinating innovation has just come out of the labs: researchers have developed screens with pixels that not only display images but can also physically change texture, allowing you to feel what you're seeing on the screen.

Imagine touching a photo of a cat and feeling the texture of the fur, or dragging a file and feeling resistance when it passes over a folder. This is the next frontier of human-computer interfaces.

What Are Tactile Pixels?

Unlike traditional screens that only emit light, these new screens incorporate micro-actuators in each pixel that can raise or lower the surface, create localized vibrations, and simulate different textures.

Technologies involved:

  • Piezoelectric micro-actuators: Materials that change shape when electricity is applied
  • Electrorheological fluids: Liquids that change viscosity with electric fields
  • Electroactive polymers: Plastics that contract and expand
  • Micro-pin arrays: Grids of tiny, individually controllable raised pins

Current technical specifications:

Characteristic            Value
Tactile resolution        ~1-2 mm between points
Elevation height          0.1-0.5 mm
Response time             10-50 ms
Distinct texture modes    10-20
Power overhead            +15-30% vs. a conventional screen

How the Technology Works

The system combines multiple layers of technology to create the illusion of physical texture.

Layered Architecture

Display Layer (Top):
Traditional OLED or LCD screen for visual display.

Actuation Layer (Middle):
Matrix of micro-actuators that can create elevations and vibrations on the surface.

Control Layer (Bottom):
Circuits that coordinate actuators based on visual content.

Sensing Layer:
Detects where and with what pressure the user is touching to adapt feedback.
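As a rough mental model, the four layers above form a pipeline: the sensing layer reads touch position and pressure, and the control layer drives the actuator matrix in response. The sketch below is purely illustrative; the class and method names are hypothetical, since no such SDK exists yet.

```javascript
// Hypothetical sketch of the control + sensing layers described above.
// None of these classes exist in any real SDK; names are illustrative.
class TactilePixelGrid {
  constructor(cols, rows) {
    this.cols = cols;
    this.rows = rows;
    // One actuator state (elevation in mm) per tactile point
    this.elevation = new Float32Array(cols * rows);
  }

  // Control layer: set elevation for the actuator under a given point
  setElevation(x, y, mm) {
    this.elevation[y * this.cols + x] = Math.min(mm, 0.5); // ~0.5 mm hardware limit
  }

  // Sensing layer: adapt feedback to the measured finger pressure (0..1)
  onTouch(x, y, pressure) {
    // Firmer presses get proportionally stronger elevation feedback
    this.setElevation(x, y, 0.1 + 0.4 * pressure);
  }
}

const grid = new TactilePixelGrid(120, 200); // ~1.5 mm pitch on a phone panel
grid.onTouch(10, 20, 1.0);
console.log(grid.elevation[20 * 120 + 10]); // 0.5
```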

Feedback Mechanisms

1. Static Elevation:
Pixels that rise and hold their position, creating physical textures that persist for as long as your finger rests on the screen.

  • Buttons that "pop" from the screen
  • Element edges you can feel
  • Simulated material textures

2. Localized Vibration:
Different vibration frequencies create the sensation of different materials.

  • High frequency: smooth surfaces
  • Low frequency: rough surfaces
  • Complex patterns: specific textures

3. Variable Friction:
The surface changes its "grip" as you drag your finger.

  • "Sticky" areas to indicate drop zones
  • "Slippery" surfaces for fast scrolling
  • Resistance to indicate limits
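One way an OS could expose these three mechanisms to apps is as a small table of effect descriptors keyed by UI intent. A hypothetical sketch (effect names and parameters are invented for illustration):

```javascript
// Hypothetical mapping from UI intent to one of the three feedback
// mechanisms described above: elevation, vibration, or friction.
const HAPTIC_EFFECTS = {
  'button-edge':     { mechanism: 'elevation', heightMm: 0.3 },
  'smooth-material': { mechanism: 'vibration', frequencyHz: 250 }, // high freq → smooth
  'rough-material':  { mechanism: 'vibration', frequencyHz: 60 },  // low freq → rough
  'drop-zone':       { mechanism: 'friction',  level: 'sticky' },
  'scroll-limit':    { mechanism: 'friction',  level: 'resistive' },
};

function effectFor(intent) {
  // Unknown intents degrade to no haptic effect rather than failing
  return HAPTIC_EFFECTS[intent] ?? { mechanism: 'none' };
}

console.log(effectFor('drop-zone').mechanism); // "friction"
```

Keeping the mapping in data rather than code also makes it easy to let users tone effects down globally, which matters for the fatigue concerns discussed later.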

💡 Technical detail: the spatial acuity of the human fingertip is on the order of 1mm. Current tactile screens are approaching that pitch, which is what makes convincing textures possible.
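To put that resolution in perspective, here is a back-of-envelope count of how many actuator points a phone-sized panel would need. The panel dimensions are assumptions (roughly a 6" display's active area); the pitch is the mid-range of the ~1-2mm spacing quoted above.

```javascript
// Back-of-envelope: how many tactile points does a phone screen need?
const widthMm = 70;    // assumed active-area width of a ~6" panel
const heightMm = 150;  // assumed active-area height
const pitchMm = 1.5;   // mid-range of the ~1-2 mm actuator spacing

const cols = Math.floor(widthMm / pitchMm);  // 46
const rows = Math.floor(heightMm / pitchMm); // 100
console.log(cols * rows); // 4600 individually driven actuator points
```

Thousands of independently driven actuators per screen is a big part of why the durability, cost, and power challenges discussed below are so significant.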

Revolutionary Applications

This technology opens possibilities that go far beyond smartphones and tablets.

Accessibility

Blind and Visually Impaired:

  • Dynamic Braille that changes with content
  • Graphs and maps that can be "felt"
  • Tactile navigation in interfaces
  • Confirmation feedback without looking

Elderly:

  • Larger, easier-to-locate buttons
  • Tactile action confirmation
  • Reduction of touch errors

People with Tremors:

  • Tactile guides for precision
  • Element edge feedback
  • Input stabilization

Gaming and Entertainment

Mobile Games:

  • Feel the texture of different terrains
  • Localized impact feedback
  • Controls you can find without looking
  • Enhanced gameplay immersion

E-Readers:

  • Page-turning sensation
  • Textures of different paper types
  • Illustrations that can be explored tactilely

Streaming and Video:

  • Texture sync with visual content
  • Immersive experiences in documentaries
  • Interaction with educational content

Education

Sciences:

  • Interactive anatomical models
  • Molecular structures that can be felt
  • Geographic maps with relief

Art and Design:

  • Tactile exploration of paintings and sculptures
  • Feedback for 3D design
  • Material simulation in CAD

Music:

  • Virtual keyboards with key feel
  • Faders and knobs with resistance
  • Virtual instrument feedback

Implications for Developers

This new interaction modality requires rethinking how we design interfaces.

New UX Dimension

Traditional Design:

  • Visual: color, shape, animation
  • Audio: sounds, music
  • Touch: touch location

Design with Haptics:

  • Visual: color, shape, animation
  • Audio: sounds, music
  • Touch: touch location
  • Tactile: texture, resistance, elevation

Emerging APIs and Frameworks

There are no established standards yet, but trends are emerging.

Expected concepts:

// Illustrative pseudo-code of future haptic APIs

// Define texture for an element
element.setHapticTexture({
  type: 'rough',
  intensity: 0.7,
  pattern: 'sandpaper'
});

// Button elevation
button.setElevation({
  height: 0.3, // mm
  edges: 'rounded',
  ramp: 'smooth'
});

// Feedback when dragging
dragZone.onDrag((position) => {
  if (isOverDropTarget(position)) {
    return { friction: 'sticky' };
  }
  return { friction: 'smooth' };
});

// Scroll resistance
scrollView.setResistance({
  atBounds: 'increasing',
  inertia: 0.3
});
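Until anything like the APIs above ships, the closest real building block on the web is the Vibration API (navigator.vibrate). A defensive pattern worth adopting now is feature detection with graceful degradation: try the (hypothetical) texture API, fall back to a vibration pulse, and otherwise rely on visual and audio cues. The setHapticTexture method below is the same invented name used in the pseudo-code; only navigator.vibrate is real.

```javascript
// Graceful degradation: prefer a hypothetical texture API if present,
// otherwise fall back to the real, widely supported Vibration API.
function textureFeedback(element, texture) {
  if (typeof element.setHapticTexture === 'function') {
    // Hypothetical future API, as sketched above
    element.setHapticTexture(texture);
    return 'texture';
  }
  if (typeof navigator !== 'undefined' && typeof navigator.vibrate === 'function') {
    // Real Vibration API: a short pulse as a coarse substitute
    navigator.vibrate(20); // duration in milliseconds
    return 'vibration';
  }
  return 'none'; // no haptics available: rely on visual/audio cues
}
```

This is the same abstraction problem called out under Standardization below: apps should code against capabilities, not against any one manufacturer's hardware.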

Design Considerations

Consistency:

  • Textures should have consistent meaning
  • Buttons should always have the same basic feel
  • Error feedback should be distinctive

Subtlety:

  • Don't overdo feedback
  • Respect user preferences
  • Consider tactile fatigue

Accessibility:

  • Tactile feedback can't be sole indication
  • Combine with visual and audio
  • Allow customization
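The rule that tactile feedback can't be the sole indication can even be enforced structurally, by having the UI layer reject any feedback event that declares only a tactile channel. A hypothetical guard (the event shape is invented for illustration):

```javascript
// Hypothetical guard: tactile feedback must never be the only channel.
function validateFeedback(event) {
  const channels = Object.keys(event.channels || {});
  const nonTactile = channels.filter((c) => c !== 'tactile');
  if (channels.includes('tactile') && nonTactile.length === 0) {
    throw new Error('Tactile feedback must be paired with a visual or audio channel');
  }
  return true;
}

// OK: tactile is paired with a visual highlight
validateFeedback({ channels: { tactile: 'sticky', visual: 'highlight' } });
```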

Technical and Commercial Challenges

Despite the potential, there are significant obstacles to mass adoption.

Engineering Challenges

Durability:

  • Mechanical actuators have limited lifespan
  • Moving surfaces suffer wear
  • Protection against dust and liquids is complex

Energy Consumption:

  • Actuators require significant energy
  • Mobile device batteries are already limited
  • Trade-off between haptic capability and battery life

Production Cost:

  • Additional layers increase cost
  • Precision manufacturing required
  • Scale not yet achieved

Thickness:

  • Extra layers increase thickness
  • Current trend is thinner screens
  • Design compromise necessary

Market Challenges

Consumer Education:

  • Users need to understand the value
  • Demonstration is essential (need to touch to understand)
  • Marketing different from visual features

Content Ecosystem:

  • Apps need to support haptics
  • Content creators need tools
  • A chicken-and-egg problem: apps wait for hardware, hardware waits for apps

Standardization:

  • Each manufacturer may implement differently
  • Developers need abstraction
  • Inconsistent experience between devices

Adoption Timeline

When can we expect to see this technology in daily life?

Short Term (2025-2027)

Specialized devices:

  • Tablets for designers
  • Accessibility devices
  • Premium gaming controllers
  • Smartphone prototypes

Characteristics:

  • Limited tactile resolution
  • High cost ($500+ premium)
  • Enthusiast niche

Medium Term (2028-2032)

Premium mainstream:

  • Flagship smartphones
  • Mainstream tablets
  • High-end laptops
  • Advanced wearables

Characteristics:

  • Improved tactile resolution
  • Moderate cost ($100-200 premium)
  • Growing app ecosystem

Long Term (2033+)

Expected standard:

  • All touch devices
  • Smart surfaces (tables, walls)
  • Vehicles (panels and controls)
  • Appliances

Characteristics:

  • Resolution close to human perception
  • Marginal cost
  • Standardized APIs

What to Do Now

For developers interested in preparing for this new era of interfaces:

1. Study Haptic Design:
Familiarize yourself with existing haptic feedback principles in current devices (smartphone vibrations).

2. Think Multi-Modal:
Design interfaces that already coherently combine visual, audio, and touch.

3. Consider Accessibility:
Accessible interfaces today will be easier to adapt for haptics tomorrow.

4. Follow Hardware:
Follow announcements from screen and device manufacturers about haptic innovations.

5. Experiment with Prototypes:
When available, experiment with devices with advanced haptics to understand the experience.

If you're interested in how interfaces are evolving, I recommend checking out another article: Google Confirms AI Glasses with Gemini for 2026, where you'll discover how spatial computing is redefining how we interact with technology.

Let's go! 🦅
