Hi everyone,
I recently started using Microsoft Power BI, and I have been stuck on this for a few hours already. I hope someone can help me; this is probably a simple question for the gurus.
Here is the scenario:
I have
- a slicer whose data table (Table 1) has two columns (Index, Name).
- a table visual whose data table (Table 2) has three columns (Id, Name, Applicabilities). Applicabilities is a 200-character string made up of 1s and 0s. (Made-up sample rows are shown below.)
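For concreteness, here are some made-up sample rows (not my real data, just to show the shape):

    Table 1 (slicer)
    Index  Name
    1      Alpha
    2      Beta
    3      Gamma

    Table 2 (table visual)
    Id    Name     Applicabilities
    101   Widget   0110...  (200 characters)
    102   Gadget   1001...  (200 characters)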
Background:
Table 1's Index corresponds to a character position in Table 2's Applicabilities string. For example, with the sample rows above, Index = 3 points at the third character of each Applicabilities string, which is "1" for Widget and "0" for Gadget.
What I want to achieve is the following:
After the user selects an item from the slicer, Power BI takes the selected Index (call it P) and shows only the Table 2 rows whose Applicabilities string has a "1" at position P.
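In other words, the filter I am after is roughly this (P being the Index currently selected in the slicer):

    -- show a Table 2 row exactly when:
    MID ( 'Table 2'[Applicabilities], P, 1 ) = "1"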
What I have tried so far:
1. I created a measure (SelectedIndex) on Table 1 to get the selected Index.
2. I created a calculated column (Applicable) on Table 2 to pick out the character of the Applicabilities string at that position, based on the measure. (Both definitions are written out below.)
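Here are the two definitions, written out:

    -- Measure on Table 1: the Index currently selected in the slicer
    SelectedIndex = MIN ( 'Table 1'[Index] )

    -- Calculated column on Table 2: the character of Applicabilities at the selected position
    Applicable = MID ( 'Table 2'[Applicabilities], [SelectedIndex], 1 )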
However, SelectedIndex always returns the smallest Index in Table 1's entire dataset, no matter what is selected in the slicer.
Any help would be very much appreciated.