Code works on simulator, not on micro-bit

I have code in block form that is supposed to turn the entire array of LEDs on and then off, four times.
Then, if I press A, it is supposed to exercise the different I/O pins and count the number of times that another piece of code fails to read back the combination that was written. On the simulator it never fails, but on the micro:bit, the numbers it is supposed to scroll turn into a few individual LEDs and do not scroll.
Hmm, I think I answered my own question: I am operating the I/O pins, and pin 7 and pin 11 might be left on or off. I need to clear them before scrolling the number.
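In MicroPython terms, that fix might look like the sketch below (a hedged sketch, not the poster's block code; the pin numbers and the need for `display.off()` are assumptions to verify against your board). The import guard lets the same file load on a host PC, where the `microbit` module does not exist:

```python
# Hedged MicroPython sketch: release the display, clear the pins that
# were exercised, then re-enable the display before scrolling.
# The microbit module only exists on the device, so guard the import.
try:
    from microbit import display, pin7, pin11
    ON_DEVICE = True
except ImportError:
    ON_DEVICE = False

if ON_DEVICE:
    display.off()            # free the lines shared with the LED matrix
    pin7.write_digital(0)    # clear leftover state from the pin-exercise loop
    pin11.write_digital(0)
    display.on()             # hand the pins back to the display driver
    display.scroll(24)       # now the digits should scroll normally
```

On a v1 board, pin 7 is one of the LED column lines, so driving it while the display is active is exactly the kind of thing that corrupts scrolling.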

An update: I wish to use the micro:bit to operate a simple lifting mechanism. The design has the micro:bit plugged into a PCB that provides tri-state capability on the I/O pins to the micro:bit, but I use two of the I/O pins to operate the tri-state interface. Those two I/O pins affect the micro:bit LEDs even when the micro:bit is standing alone, i.e., not connected to anything. Have I merely assigned the wrong I/O pins to this task, or is the tri-state idea a no-go from the get-go on the micro:bit?
In the current design, I am using I/O 8 and I/O 16 to toggle the tri-states to high-Z. When plugged in, they are connected to the input of an op-amp whose output controls the state of the tri-state devices (two modules of 8 devices each, with the state-control input on each module driven by the op-amp output on one of the two I/O pins). Only one control input is needed to switch the tri-states from their on/off states to the high-Z state, in which the output is not seen by the I/O pin connected to it on the micro:bit.
So I'm wondering whether the treatment of the I/O pins on the micro:bit precludes this kind of interface between the tri-states and the micro:bit. When I program the micro:bit and the code modifies any of the pins, it seems to go into some kind of mode where several of the LEDs are on at a low brightness level, even though nothing is connected to any of its pins (it is powered from the USB port connected to my PC).
I was successful in operating the LEDs when the hex file had no instructions that wrote to the I/O pins, but I did not determine precisely which combinations of reading or writing the I/O pins caused the issues I was experiencing, only that the issues went away when the same code was devoid of writes to or reads from the I/O pins. And this happened when the micro:bit was not connected to anything other than the USB cable used to load the hex file and provide power.
Any suggestions?

It may help to look at the schematics and pin assignments for the micro:bit:

There are links to schematics for each version of the micro:bit (v1.5, v2). The LEDs require a row pin and a column pin. For version 1.5, it may help to look at page 5 of the schematic, which shows the microcontroller pin number, like P0.04, the number on the micro:bit edge connector, which is 8, and the line connecting it to the LED grid, which is COL1. (That is, P0.04 on the microcontroller is considered to be P8 in MakeCode and is connected to COL1 of the LED array.)

You can look at the schematic of your particular model of micro:bit to try to select pins that aren’t already allocated for another use.

I hope that helps!

Thanks for responding, but no, I'm afraid you are reading that somewhat wrong. All of the I/O pins remained the same between v1 and v2 except for pin 9, and the names have changed on the LEDCOLn pins. And I have watched the simulator run the code; the pins it turns on are the correct ones. I did look at the schematics, though not early on; at first I looked at the pinouts without looking at the specs, so I did not pick up on the multiple uses of 'General Purpose I/O' pins that have special purposes but refuse to work correctly as plain I/O. They are billed as I/O but then come with warnings about their multi-use. They ought to have a programmable interface that changes the function of the pin, rather than being hardwired to confusion so that only the secondary feature may be used safely, which is the case with pins 15, 14, 13, 12, 11, 9, 7, 6, 5, 4 and 3. How crazy is that? That leaves only pins 0, 1, 2, 8, and 16 available for I/O: just 5 lines. The other 10 lines are more or less dedicated to the LEDs and other on-board features such as SPI, a tremendously limiting feat, especially if the processor cannot shut the feature off when it is not in use, which appears to be the case in this situation.
But be that as it may, that was not my question. The question is why the simulator is OK with what the code tells it to do, but the micro:bit is not. On top of that, the micro:bit seems to object strenuously to the code modifying pins even when they are unused pins like P8 and P16. And if the code changes one of them, the LED 'show' function works wrong.
The other part of my question was about how I had planned to get around this profundity of featurability that pervades the micro:bit's implementation (what good is it to have these features if they kill other features and force exception cases all over the place?). Anyway, I have designed my board to interface with it using tri-state buffers. I have 9 switches and 2 control lines, plus three relays. I want the micro:bit to watch the switches to determine the position in the cycle, and to decide which relays must be on or off so that the mechanism operates smoothly under control of the two control inputs. This could take as many as 9 I/Os for the switches, two for the controls, and three more for the relays, plus the two I/Os that disconnect those I/Os from the switches, control lines, and relay actuators, ostensibly so that I could make it signal something on the LEDs such as 'Error 24' or 'Parked', etc. Instead it greets me with gibberish on the display, or no display at all, unless I disallow all 'pin' operations, even with the micro:bit standing alone, in which case pins 8 and 16 are not connected to anything.
I will get back to working on the code soon to try to discover whether it differs from this description, but so far I think I am using the pins as described, and I expected them to work as described.
I was hoping that someone would give me a definitive answer regarding tri-states on the I/O lines: are they precluded in some way? Is there any way to turn off the on-board features so that their I/O pins can be repurposed for a time? If so, how?

I should have mentioned what I used to determine what the pins do. On page 3 of that output, the statement says "pin 16: Dedicated GPIO (conventionally also used for SPI 'Chip Select' function)", whatever that means. If you look at the lists, there are very few pins that do not have multiple usages mentioned; one of them is pin 8, and pin 16 is only a chip-select option for SPI. Since that pin will normally be off except when I want the tri-states off-line, I was hoping they would not interfere. I use pin 0 and pin 1 as my control lines; pin 2, pin 7 and pin 11 operate my relays through an op-amp to a driver which can handle up to about 1 amp; pins 3-6 and pins 9, 13, and 14 are the switch inputs (again, all inputs come from a tri-state buffer, and outputs go through an op-amp). None of these were connected when the problem was experienced.
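As a cross-check, the planned assignment above can be compared against the on-board functions that share edge-connector pins. The sketch below is plain Python (runs anywhere, no micro:bit needed); the `SHARED` table is my reading of the published v1 pin map and should be treated as an assumption to verify against your board's schematic:

```python
# Sketch: check a planned pin assignment against on-board functions
# that share edge-connector pins on a micro:bit v1 (assumed mapping).
SHARED = {
    3: "LED COL1", 4: "LED COL2", 5: "Button A",
    6: "LED COL9", 7: "LED COL8", 9: "LED COL7",
    10: "LED COL3", 11: "Button B",
    13: "SPI SCK", 14: "SPI MISO", 15: "SPI MOSI",
    19: "I2C SCL", 20: "I2C SDA",
}

# The assignment described in the post above.
PLAN = {
    "controls": {0, 1},
    "relays": {2, 7, 11},
    "switches": {3, 4, 5, 6, 9, 13, 14},
    "tristate_enable": {8, 16},
}

def conflicts(plan, shared):
    """Return {role: {pin: shared_function}} for every clash."""
    out = {}
    for role, pins in plan.items():
        hits = {p: shared[p] for p in pins if p in shared}
        if hits:
            out[role] = hits
    return out

for role, hits in conflicts(PLAN, SHARED).items():
    print(role, "->", hits)
```

If the v1 mapping is right, the control lines (0, 1) and the tri-state enables (8, 16) are clean, but the relay pins 7 and 11 land on a LED column and Button B, and every one of the switch pins collides with the display, a button, or SPI, which would explain both the dim LEDs and the garbled scrolling.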

Yeah… I made a silly mistake. The numbers on the schematic are not edge-connector numbers (I'm assuming they are the pin numbers on the microcontroller). I should have looked at the edge-connector connections elsewhere in the schematic too. The two pin maps at are better. My previous example should have been: P0.04 (microcontroller) is P3 (edge) and COL1 (on-board) on the v1.

It may help if you provide more details, like a link to the code you’re using for testing and a description of exactly what’s going wrong (i.e., the code that’s causing LEDs to behave unusually when you interact with GPIO).

I don't think there's a reason tri-states wouldn't work on GPIO pins that have no other on-board use, but you may need to change the default pin settings (drive mode / pull-ups and pull-downs).
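For what it's worth, in MicroPython the relevant knobs look roughly like this (a hedged sketch; the specific pins are placeholders from the thread, and the import guard lets the file load on a host PC where the `microbit` module doesn't exist):

```python
# Hedged MicroPython sketch: release the display so its shared pins
# become plain GPIO, drive the tri-state enable, and read an input
# with an explicitly chosen pull state.
try:
    from microbit import display, pin8, pin16
    ON_DEVICE = True
except ImportError:
    ON_DEVICE = False

if ON_DEVICE:
    display.off()                    # free the LED-shared lines
    pin8.write_digital(1)            # drive the tri-state enable high
    value = pin16.read_digital()     # puts the pin in input mode
    # pin16.set_pull(pin16.NO_PULL)  # then choose the pull explicitly
    display.on()                     # hand the pins back to the display
```

In MakeCode blocks, `led enable false` plays the same role as `display.off()` here: it stops the display driver from multiplexing the shared row/column pins so your own pin operations don't fight with it.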