mozobata

Why does the output voltage increase when the PWM input frequency to output enable increases?


I'm trying to pulse-width modulate an output from a 74HC595 shift register.

To achieve this, I plugged a PWM output from a Teensy 3.1 microcontroller into the output enable input of the 74HC595. The A output of the 74HC595 is then connected to ground via a 640 Ohm resistor. The voltage between VCC and ground is 3.3 V.

Then I use the microcontroller to shift 0b00000001 into the 74HC595, and I start to PWM the output enable input at a 50% duty cycle.
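Roughly, the code looks like this (a minimal sketch of my setup; the pin numbers are placeholders, not necessarily my actual wiring):

```cpp
const int dataPin  = 2;   // 74HC595 SER
const int clockPin = 3;   // 74HC595 SRCLK
const int latchPin = 4;   // 74HC595 RCLK (latch)
const int oePin    = 5;   // 74HC595 /OE, driven by the Teensy PWM

void setup() {
  pinMode(dataPin, OUTPUT);
  pinMode(clockPin, OUTPUT);
  pinMode(latchPin, OUTPUT);

  // Shift 0b00000001 into the register and latch it,
  // so only output A is set high.
  digitalWrite(latchPin, LOW);
  shiftOut(dataPin, clockPin, MSBFIRST, 0b00000001);
  digitalWrite(latchPin, HIGH);

  // PWM the output enable pin at 50% duty cycle.
  // analogWriteFrequency() is the Teensy core function
  // for setting the PWM carrier frequency.
  analogWriteFrequency(oePin, 100);  // 100 Hz; raising this is where the problem appears
  analogWrite(oePin, 128);           // ~50% of the default 0-255 range
}

void loop() {}
```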

Here is the datasheet of the 74HC595.

I expect to measure about 1.6 V between output A of the 74HC595 and ground. And indeed, if the PWM carrier frequency is slow (100 Hz), that's what I observe.
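(For reference, that expectation is just the duty-cycle average of the square wave, assuming the meter reads the DC average: V_avg = duty × VCC = 0.5 × 3.3 V ≈ 1.65 V.)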

The problem is, when I increase the PWM carrier frequency, the voltage between A and ground increases. For example, I measure 2.7 V at a 10 kHz carrier frequency. I also measured the voltage between the Teensy PWM output and ground, and it is as expected: 1.6 V.

So, I know ICs can't be fed arbitrarily high frequencies, but I was under the impression that 10 kHz doesn't qualify as a high frequency.

I seem to be unable to understand the problem, so here I am: can anyone explain the reasons for this behavior?
