Fixing the Popup UI Issue
My app is already live on the App Store, and my main focus now is improving the user experience and fixing bugs. One issue had bothered me for a long time: when I added many elements to a page, scrolling up and down became unavoidable, and whenever I opened a popup in that state, the entire background page would still scroll, which looked really awkward.
I once tried disabling scrolling in JavaScript when the popup was open. It worked, but then the popup itself couldn't scroll either. After consulting AI, I finally learned the cause: all of my pages are injected into the index page through Shadow DOM, so the scroll lock applied to the popup as well. The only feasible solution was to place the popup in the main document instead of inside the Shadow DOM.
That sounded simple, but it turned out to be a huge task: I had to migrate CSS styles into JavaScript and refactor parts of the JavaScript architecture. Still, I wasn't afraid, because I had AI. With its help, the changes were completed quickly. A few bugs did show up along the way, but once I pointed them out, AI helped me fix them just as fast.
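The resulting structure can be sketched like this; the function names and the `doc` parameter are illustrative (in the app it is simply `document`), not my exact code:

```javascript
// Sketch: mount the popup in the main document instead of the Shadow DOM,
// lock background scrolling on <body>, and let the popup scroll on its own.
// `doc` stands in for the global `document`; names here are illustrative.
function openPopup(doc, popupEl) {
  doc.body.style.overflow = 'hidden';   // freeze the background page
  popupEl.style.overflowY = 'auto';     // the popup keeps its own scrollbar
  doc.body.appendChild(popupEl);        // mounted outside any shadowRoot
}

function closePopup(doc, popupEl) {
  doc.body.style.overflow = '';         // restore background scrolling
  doc.body.removeChild(popupEl);
}
```

Because the popup now lives in the main document, the `overflow: hidden` lock on `<body>` no longer reaches it, so the popup scrolls while the page behind it stays put.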
Adding Photo Upload Feature
My team suggested adding a photo upload feature to my app, allowing users to submit images such as bleeding spots or medical cases. Initially, I was hesitant due to concerns about limited server storage. However, after checking, I found that only 15% of the 40GB disk space was used, so I decided to implement it.
The main challenge was how to store these images. Although MySQL supports storing images, my data is primarily stored in JSON files, making direct storage cumbersome. I came up with a solution: encode the images in Base64 and include them as part of the JSON data. On the frontend, the images are displayed using HTML <img> tags.
This approach introduced another issue: Base64-encoded images are roughly 33% larger than their binary originals, which could cause transmission problems if a JSON file grows too large. To address this, I implemented a 5MB size limit per JSON file in JavaScript, triggering an error if it is exceeded, and compressed each image to around 500KB. This effectively prevents overly large JSON files.
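The size checks can be sketched as below; the 5 MB and ~500 KB figures are the limits described above, while the actual compression step (re-encoding via a canvas at reduced quality) is omitted because it needs a browser DOM. Helper names are illustrative:

```javascript
// Illustrative limits from the text: 5 MB per JSON file, ~500 KB per image.
const MAX_JSON_BYTES = 5 * 1024 * 1024;
const TARGET_IMAGE_BYTES = 500 * 1024;

// Base64 packs 3 bytes into every 4 characters, which is where the
// ~33% overhead comes from; decoded size follows from the string length.
function base64Bytes(b64) {
  const padding = (b64.match(/=+$/) || [''])[0].length;
  return (b64.length * 3) / 4 - padding;
}

// Reject a JSON payload that exceeds the 5 MB cap before uploading it.
function validatePayload(jsonString) {
  const bytes = new TextEncoder().encode(jsonString).length;
  if (bytes > MAX_JSON_BYTES) throw new Error('JSON payload exceeds 5 MB limit');
  return bytes;
}
```

In the app itself, each image would be re-encoded until `base64Bytes` of the result drops to roughly `TARGET_IMAGE_BYTES` before being embedded in the JSON.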
Additionally, I integrated Capacitor's camera plugin, allowing the app to directly access the device camera. Finally, I implemented the Base64 decoding logic on the main pages, completing the photo upload and display functionality.
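The capture step uses `Camera.getPhoto` from `@capacitor/camera` with a Base64 result type; the helper below, with illustrative names, only packages the returned string for both the JSON record and the `<img>` tag:

```javascript
// In the app (assumed usage, per the @capacitor/camera plugin):
//   const photo = await Camera.getPhoto({ resultType: CameraResultType.Base64 });
//   const record = toImageRecord(photo.base64String, photo.format);
// This pure helper wraps the Base64 string into the shape stored in JSON
// plus a data URL that an <img> tag can display directly.
function toImageRecord(base64Data, format = 'jpeg') {
  return {
    mimeType: `image/${format}`,
    base64: base64Data,
    dataUrl: `data:image/${format};base64,${base64Data}`, // usable as <img src>
  };
}
```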
Adding Calendar Feature for Symptom Visualization
After some optimization and interface improvements, I needed to implement a new feature. I wanted to create a calendar interface that would allow users to visually see their symptom patterns throughout the month using different colors. The biggest challenge was how to store symptom data and retrieve it efficiently for the calendar page.
Initially, I considered using the large JSON files that were saved and uploaded from the metric page. However, I discovered that these files were too large, especially when they contained photo information, making calendar page retrieval extremely slow.
I came up with a solution: when saving metric data, I would also write to a separate database table with a similar structure, but store only numeric codes representing symptom conditions in its JSON content. For example, 0 represents no symptoms, 1 represents cutaneous purpura, and 2 represents articular purpura.
The database structure I implemented:
CREATE TABLE IF NOT EXISTS symptom_files (
    id VARCHAR(64) PRIMARY KEY,           -- Unique identifier
    user_id VARCHAR(128) NULL,            -- User ID
    username VARCHAR(128) NULL,           -- Username
    file_name VARCHAR(255) NOT NULL,      -- File name
    content LONGTEXT NOT NULL,            -- JSON content (main data)
    created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,  -- Creation time
    INDEX idx_user_id (user_id),          -- User ID index
    INDEX idx_username (username),        -- Username index
    INDEX idx_created_at (created_at)     -- Creation time index
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
          
The JSON structure stored in the content field:
{
  "exportInfo": {
    "exportTime": "2024-01-15 14:30:25",
    "recordTime": "2024-01-15 14:30",
    "version": "1.0",
    "appName": "紫癜精灵",
    "dataType": "symptom_tracking"
  },
  "symptomData": {
    "symptoms": [1, 2, 3]  // Array of numeric codes
  }
}
          
In the calendar page, data is retrieved by user ID and month, and each day is colored according to its recorded symptom codes, so users can see their symptom patterns for that month at a glance. This approach significantly improved performance while maintaining data integrity and giving users an intuitive way to track their health patterns over time.
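A minimal sketch of the coloring step, assuming a hypothetical code-to-color table (the numeric codes follow the scheme described above):

```javascript
// Hypothetical mapping from symptom codes to calendar colors
// (0 = no symptoms, 1 = cutaneous purpura, 2 = articular purpura).
const SYMPTOM_COLORS = {
  0: '#e0e0e0', // grey: no symptoms
  1: '#e57373', // red: cutaneous purpura
  2: '#ffb74d', // orange: articular purpura
};

// Pick a day's color from its symptoms array, using the highest code
// recorded that day (assumed here to be the most significant one).
function colorForDay(symptoms) {
  if (!Array.isArray(symptoms) || symptoms.length === 0) return SYMPTOM_COLORS[0];
  const worst = Math.max(...symptoms);
  return SYMPTOM_COLORS[worst] || SYMPTOM_COLORS[0];
}
```

Because the calendar table stores only these small integer arrays, a whole month of days can be fetched and colored without touching the heavy photo-bearing JSON files.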
Optimizing Photo Storage Performance
I discovered that the previous method of storing images through Base64 encoding had serious performance issues. Every time data was read, these massive image files had to be downloaded, causing extremely slow application response times. After careful consideration, I decided to use the file system to store images, saving only the image access links in the database. Taking the diet page as an example, the optimized JSON data structure is as follows:
{
  "exportInfo": {
    "exportTime": "2024-01-15 14:30:25",
    "recordTime": "2024-01-15 12:30:00",
    "version": "1.0",
    "appName": "紫癜精灵",
    "dataType": "diet_record"
  },
  "dietData": {
    "meal_1": {
      "time": "12:30",
      "food": "米饭、青菜、鸡肉",
      "mealId": 1,
      "images": ["https://app.zdelf.cn/uploads/diet_image_123.jpg"],
      "date": "2024-01-15",
      "timestamp": "2024-01-15 12:30:00"
    },
    "meal_2": {
      "time": "18:00",
      "food": "面条、蔬菜汤",
      "mealId": 2,
      "images": [],
      "date": "2024-01-15",
      "timestamp": "2024-01-15 18:00:00"
    }
  }
}
          
The images field now stores URLs for accessing the images instead of Base64-encoded data. This dramatically increased database read speed, as the application no longer downloads large image payloads every time data is accessed. Images are now loaded on demand, significantly improving the user experience.
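Rendering then just emits `<img>` tags pointing at those URLs; browser-native `loading="lazy"` defers each download until the photo scrolls into view. A sketch with illustrative names:

```javascript
// Build the <img> markup for one meal record from the JSON shown above.
// loading="lazy" makes the browser fetch each photo only when it is needed.
function renderMealImages(meal) {
  return (meal.images || [])
    .map(url => `<img src="${url}" loading="lazy" alt="meal photo">`)
    .join('');
}
```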
Fixing Time Zone Issues
I also discovered another critical issue: the displayed time was consistently inaccurate. After consulting AI, I learned that the application wasn't using China's timezone. To ensure time accuracy, I implemented a solution based on Capacitor's geolocation support (exposed through the standard navigator.geolocation API): read the user's location and determine the timezone from their geographical position.
Here's the code for obtaining the user's location:
async _getUserLocation() {
    return new Promise((resolve, reject) => {
        navigator.geolocation.getCurrentPosition(
            (position) => {
                const location = {
                    latitude: position.coords.latitude,
                    longitude: position.coords.longitude,
                    accuracy: position.coords.accuracy
                };
                resolve(location);
            },
            (error) => {
                console.warn('⚠️ Unable to get user location:', error.message);
                resolve(null);
            },
            {
                enableHighAccuracy: true,
                timeout: 10000,
                maximumAge: 300000 // 5-minute cache
            }
        );
    });
}
          
I then created a mapping table to determine the timezone based on the user's location:
const timezoneMap = {
    '-12': 'Pacific/Kwajalein',
    '-11': 'Pacific/Midway',
    // ... other timezones
    '8': 'Asia/Shanghai',  // China timezone
    '9': 'Asia/Tokyo',
    // ...
};
          
This automatically detects the user's timezone and applies the correct time calculations throughout the application, ensuring that all timestamps are consistent with the user's local time.
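A rough sketch of that lookup, reusing the map entries shown above: longitude is converted to an approximate UTC offset (15° per hour) and then mapped to an IANA timezone name. This is only an approximation, since real timezone borders do not follow lines of longitude, and the fallback value is my assumption:

```javascript
// Entries repeated from the mapping table above (abridged).
const timezoneMap = {
  '-12': 'Pacific/Kwajalein',
  '-11': 'Pacific/Midway',
  '8': 'Asia/Shanghai',
  '9': 'Asia/Tokyo',
};

// 360 degrees / 24 hours = 15 degrees of longitude per hour of UTC offset.
function timezoneFromLongitude(longitude) {
  const offsetHours = Math.round(longitude / 15);
  return timezoneMap[String(offsetHours)] || 'Asia/Shanghai'; // assumed default
}
```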
Fixing Notification Functionality Issues
Recently, my companion has been repeatedly reporting issues with the notification functionality: every time she opens the app, she is bombarded with messages, even when the current time doesn't fall within any scheduled notification window. I attempted to fix this multiple times, but I kept asking AI to find the cause on its own, and after each update the problem remained unresolved.
I tried to reproduce the issue with AI's help but was unsuccessful. However, during one conversation, my companion revealed a crucial detail: the bug only occurs when reminder items are set to repeat, and only after several days have passed. That made me realize what was happening: the system schedules the messages, but since the app isn't running when they fire, it thinks they were never successfully sent. The moment the app opens, it finds a large backlog of "unsent" messages and delivers them all at once, regardless of the current time.
I thought about it and realized that fixing this shouldn't be too difficult: I just needed to clean up outdated messages during the app initialization phase. Here's the code I implemented:
function catchUpOverdueReminders() {
  // When entering/resuming the page, perform "silent alignment" for overdue reminders:
  // - Don't send notifications
  // - Only advance to the next scheduled time or delete one-time reminders
  try {
    const now = new Date();
    const toDelete = [];
    reminders.forEach((reminder) => {
      if (!(reminder && reminder.dailyCount > 0 && Array.isArray(reminder.dailyTimes) && reminder.dailyTimes.length > 0)) return;
      // If past the end date, delete directly
      if (isReminderExpired(reminder, now)) {
        toDelete.push(reminder.id);
        return;
      }
      // One-time (non-repeating): if no remaining time points today, delete; otherwise advance to next time today
      if (!reminder.repeatInterval || reminder.repeatInterval === 'none') {
        const nextToday = getNextTimeToday(reminder, now);
        if (nextToday) {
          scheduleUiAdvance(reminder.id, nextToday);
        } else {
          toDelete.push(reminder.id);
        }
        return;
      }
      // Repeating: calculate the next trigger time from the current moment
      const baseDateStr = (() => {
        const y = now.getFullYear();
        const m = String(now.getMonth() + 1).padStart(2, '0');
        const d = String(now.getDate()).padStart(2, '0');
        return `${y}-${m}-${d}`;
      })();
      const nextAt = computeNextTime(reminder, new Date(`${baseDateStr}T00:00:00`), now);
      // If the next time exceeds the end date, delete; otherwise advance UI to nextAt
      if (!nextAt || isReminderExpired(reminder, nextAt)) {
        toDelete.push(reminder.id);
      } else {
        scheduleUiAdvance(reminder.id, nextAt);
      }
    });
    toDelete.forEach((rid) => { try { hardDeleteReminder(rid); } catch (_) { } });
    if (currentRoot) renderReminders(currentRoot);
    console.log('⏰ Aligned overdue reminders (silent advance/cleanup).');
  } catch (e) {
    console.warn('⏰ Failed to align overdue reminders:', e);
  }
}
          
As expected, the problem was successfully fixed. After resolving this issue, I realized that while AI is powerful, it still cannot replace humans at this stage; some problems still require human insight to solve.
I also discovered another issue: for recurring reminders, the app only calculated the next send time after being opened following a completed send. My solution was to schedule multiple future notifications with the system immediately when the user sets up a recurring reminder, and then update their status when the app is opened. Here's the code:
function enumerateUpcomingOccurrences(reminder, fromTime, maxDays, perReminderCap) {
  const occurrences = [];
  if (!(reminder && reminder.dailyCount > 0 && Array.isArray(reminder.dailyTimes) && reminder.dailyTimes.length > 0)) return occurrences;
  const enabledTimes = [...reminder.dailyTimes].filter(Boolean).filter(t => isTimeEnabled(reminder, t)).sort();
  if (enabledTimes.length === 0) return occurrences;
  const now = new Date(fromTime);
  const startBoundary = reminder.startDate ? new Date(`${reminder.startDate}T00:00:00`) : null;
  const endBoundary = reminder.endDate ? new Date(`${reminder.endDate}T23:59:59`) : null;
  for (let dayOffset = 0; dayOffset < maxDays; dayOffset++) {
    const day = new Date(now);
    day.setHours(0, 0, 0, 0);
    day.setDate(day.getDate() + dayOffset);
    if (startBoundary && day < startBoundary) continue;
    if (endBoundary && day > endBoundary) break;
    const ymd = formatDateYMD(day);
    for (const t of enabledTimes) {
      const at = new Date(`${ymd}T${t}:00`);
      if (at <= now) continue; // Skip past moments
      if (endBoundary && at > endBoundary) continue;
      occurrences.push(at);
      if (occurrences.length >= perReminderCap) return occurrences;
    }
  }
  return occurrences;
}
          
With these changes, the notification functionality should now be complete and working properly.
Improving Icon Accessibility
I also discovered another issue: the icons in my app came from Google's Material Design icon set, which users in mainland China cannot load without a VPN. I replaced them with Ionic's icons, which automatically adapt to iOS or Android styling and are accessible directly from China.
This change significantly improved the user experience for Chinese users, ensuring that all icons display correctly regardless of network restrictions. The Ionic icon system also provides better platform-specific iconography, making the app feel more native to each operating system.
AI Reading Database and Analyzing User Behavior
I developed an AI assistant in my app, but currently it cannot analyze user-submitted data. To differentiate my AI assistant from ordinary web chatbots, I plan to enable it to read user-submitted health-related data, analyze the user's physical condition, and provide recommendations.
Initially, my idea was to have the frontend read data from the database and then send it to the AI backend for processing. However, after careful consideration, I realized this approach was very inefficient: since both the backend and the database run on the same server, there is no need for the frontend to participate in data transmission and waste additional bandwidth.
So I adjusted the architecture: the frontend only sends basic user information (such as userid and username) to the backend; after receiving this information, the backend queries the user's diet, health, and medical record data from the database, integrates it, and then sends it to the large language model for analysis.
To save on Token costs, I added an "Enable Data Analysis" button on the frontend. The AI only reads and analyzes user data when users actively enable the analysis feature; if not enabled, the AI behaves like a regular chat mode.
After completing the functionality, I began testing and ran into a strange problem: in data analysis mode, whenever I switched pages after a chat ended, the chat history would disappear and the "AI Data Analysis" button would automatically turn off. Strangely, even under these circumstances, the AI could still access my health data.
At first, I thought it was a backend bug: perhaps the analysis logic was still executing even when the analysis feature wasn't enabled. But after checking the code, I found this wasn't the case. It wasn't until one of the AI's responses mentioned "Based on our previous chat content, your health data is..." that I realized what the problem was.
It turned out that even though the frontend state was reset after switching pages, it was still part of the same session. The AI's responses continued to be based on previous context for reasoning. To completely solve this problem, I made the AI page create a new sessionID every time it initializes, ensuring that each conversation is independent and no longer affected by previous chat content.
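The fix itself is small; here is a sketch of the per-initialization session reset, assuming `crypto.randomUUID` is available (with a fallback for older runtimes, and illustrative naming):

```javascript
// Generate a fresh sessionID on every page initialization so the AI backend
// treats each visit as an independent conversation with no carried-over context.
function newSessionId() {
  const uuid = (globalThis.crypto && typeof crypto.randomUUID === 'function')
    ? crypto.randomUUID()
    : Math.random().toString(36).slice(2); // fallback for older runtimes
  return `session-${Date.now()}-${uuid}`;
}
```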
App Version and Update Check
I have completed the functionality for the square page, but I suddenly realized an issue: since my app is not published on any Android app store, how can Android users know if their app is the latest version?
My solution is to add a JSON file on the app's website that includes update information for each version. The current JSON file looks like this:
{
  "app_name": "紫癜精灵",
  "package_name": "com.junxibao.healthapp",
  "versions": [
    {
      "version": "1.2.5.2",
      "release_date": "2025-10-9",
      "changes": [
        "Fixed an issue when users cropped their avatars"
      ]
    },
    {
      "version": "1.3.0.0",
      "release_date": "2025-10-11",
      "changes": [
        "Added square page functionality",
        "Added automatic update check feature"
      ]
    }
  ]
}
In the app, I only need to keep a local version number and compare it with the latest version in the remote JSON file. If the local version is older, the app displays an update prompt along with the detailed changes for each newer version.
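The comparison itself can be sketched as a segment-by-segment numeric compare of the dotted version strings (helper names are illustrative):

```javascript
// Compare dotted version strings numerically: -1 if a < b, 0 if equal, 1 if a > b.
function compareVersions(a, b) {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  for (let i = 0; i < Math.max(pa.length, pb.length); i++) {
    const x = pa[i] || 0;
    const y = pb[i] || 0;
    if (x !== y) return x < y ? -1 : 1;
  }
  return 0;
}

// Return the newest remote version entry if the local app is outdated, else null.
function findUpdate(localVersion, versions) {
  const latest = versions.reduce((best, v) =>
    compareVersions(v.version, best.version) > 0 ? v : best);
  return compareVersions(localVersion, latest.version) < 0 ? latest : null;
}
```

A numeric, segment-wise compare avoids the pitfalls of plain string comparison (where "1.10.0" would sort before "1.9.0").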
Thus, for future updates, I only need to update the app's local version number and modify the server JSON file. The app will then automatically check for updates without relying on any app store.
ICP Filing Challenge
One morning after waking up, my companion told me that users were reporting they couldn't find our app on the App Store. At first, I thought the users had misspelled the name and didn't pay much attention. But upon closer investigation, I discovered that the app was indeed no longer searchable on the Chinese mainland App Store. My first reaction was: could this be a bug on Apple's end? Logically, if an app gets delisted, Apple should send notifications or email reminders, but I hadn't received any messages.
When I continued checking the App Store Connect settings, I noticed a prompt in the "Countries and Regions" section—the Chinese mainland region was not ICP filed. That's when I realized the problem might be related to filing. So I immediately set out to apply for ICP filing for the app.
The entire filing process wasn't particularly complex since we already had servers and domain names. However, after submitting the materials, we were quickly rejected by Alibaba Cloud's initial review. There were two reasons:
- The reviewer believed my app was company-operated and therefore required filing under a company name, not as an individual;
- The system incorrectly recognized the information on my ID card, showing "incorrect filer identity information".
Additionally, the reviewer specifically emphasized over the phone that the app must not contain any medical-related content. I could solve the first two problems, but this last requirement left me in a difficult position, since our app was specifically designed to serve patients. At that moment, I felt very frustrated and even considered giving up on the filing, letting users continue with the beta version, or spending 1,000 yuan on Alibaba Cloud's "expert service" to handle it for me.
However, holding onto a glimmer of hope, I resubmitted the application. This time, I barely changed anything, just adding a note in the remarks: "This is a personal developer's app that does not contain any medical advice."
Unexpectedly, this time it was approved smoothly!
At this point, our app could finally be relisted on the Chinese mainland App Store.
Completing Auto Message Update Check
My teammate suggested adding an automatic "message update check" to the app: whenever someone comments on or replies to a user's post, the system should promptly notify that user.
After researching, I found there wasn't a straightforward way to handle remote push in Capacitor for my setup: iOS requires APNs and Android relies on Firebase, which is a bit heavy for my current stage.
So I chose a pragmatic interim approach: instead of push notifications, I implemented an in-app "Message Center" where users can see all messages relevant to them. The system also shows how many new items have appeared since their last visit.
The first part was simple: iterate over the database records related to the current user and sort them by time. The second part took a little thought. Later that night, I realized I could store the user's last-view time in localStorage and, when loading messages, count how many relevant records have timestamps later than that value. That gives the exact number of new messages.
Once the approach was clear, implementation went smoothly and the feature was quickly completed.
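The counting logic I settled on can be sketched like this (names are illustrative; in the app, `storage` is `window.localStorage`):

```javascript
const LAST_VIEW_KEY = 'messageCenterLastView'; // illustrative localStorage key

// Count messages whose timestamp is later than the stored last-view time.
// On a first visit (no stored value) every message counts as new.
function countNewMessages(messages, lastViewIso) {
  if (!lastViewIso) return messages.length;
  const lastView = new Date(lastViewIso).getTime();
  return messages.filter(m => new Date(m.timestamp).getTime() > lastView).length;
}

// Record the current visit so the next count starts from now.
function markViewed(storage, nowIso) {
  storage.setItem(LAST_VIEW_KEY, nowIso);
}
```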